Oct 15, 2025 12:45:36 AM org.apache.karaf.main.Main launch INFO: Installing and starting initial bundles Oct 15, 2025 12:45:36 AM org.apache.karaf.main.Main launch INFO: All initial bundles installed and set to start Oct 15, 2025 12:45:36 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Trying to lock /tmp/karaf-0.23.0/lock Oct 15, 2025 12:45:36 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Lock acquired Oct 15, 2025 12:45:36 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired INFO: Lock acquired. Setting startlevel to 100 2025-10-15T00:45:38,246 | INFO | CM Configuration Updater (Update: pid=org.ops4j.pax.logging) | EventAdminConfigurationNotifier | 5 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.3.0 | Logging configuration changed. (Event Admin service unavailable - no notification sent). 2025-10-15T00:45:39,955 | INFO | activator-1-thread-2 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Adding features: odl-jolokia/[11.0.2,11.0.2],93b0c803-4463-4530-be01-dde4997d3e80/[0,0.0.0],odl-openflowplugin-flow-services-rest/[0.20.1,0.20.1],odl-openflowplugin-app-bulk-o-matic/[0.20.1,0.20.1],odl-infrautils-ready/[7.1.7,7.1.7] 2025-10-15T00:45:40,138 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Changes to perform: 2025-10-15T00:45:40,138 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Region: root 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Bundles to install: 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.el/jakarta.el-api/3.0.3 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.enterprise/cdi-api/2.0.SP1 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.interceptor/javax.interceptor-api/1.2.2 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.transaction/javax.transaction-api/1.2 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-wrap/2.6.17/jar/uber 2025-10-15T00:45:40,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.jdbc/1.1.0 2025-10-15T00:45:40,141 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Installing bundles: 2025-10-15T00:45:40,141 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.el/jakarta.el-api/3.0.3 2025-10-15T00:45:40,144 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.enterprise/cdi-api/2.0.SP1 2025-10-15T00:45:40,145 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.interceptor/javax.interceptor-api/1.2.2 2025-10-15T00:45:40,146 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.transaction/javax.transaction-api/1.2 2025-10-15T00:45:40,147 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1 2025-10-15T00:45:40,148 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3 2025-10-15T00:45:40,149 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7 2025-10-15T00:45:40,150 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7 2025-10-15T00:45:40,151 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7 2025-10-15T00:45:40,152 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-wrap/2.6.17/jar/uber 2025-10-15T00:45:40,155 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.jdbc/1.1.0 2025-10-15T00:45:40,185 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Starting bundles: 2025-10-15T00:45:40,187 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.url.wrap/2.6.17 2025-10-15T00:45:40,191 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.interceptor-api/1.2.2 2025-10-15T00:45:40,191 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3 2025-10-15T00:45:40,191 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 2025-10-15T00:45:40,191 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1 2025-10-15T00:45:40,192 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1 2025-10-15T00:45:40,192 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0 2025-10-15T00:45:40,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.osgi.service.jdbc/1.1.0.202212101352 2025-10-15T00:45:40,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.ops4j.pax.jdbc.pool.common/1.5.7 2025-10-15T00:45:40,194 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7 2025-10-15T00:45:40,199 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc/1.5.7 2025-10-15T00:45:40,205 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Done. 2025-10-15T00:45:42,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Changes to perform: 2025-10-15T00:45:42,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Region: root 2025-10-15T00:45:42,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Bundles to uninstall: 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Bundles to install: 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.checkerframework/checker-qual/3.50.0 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.code.gson/gson/2.13.1 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/guava/33.4.8-jre 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/failureaccess/1.0.3 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.googlecode.json-simple/json-simple/1.1.1 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.h2database/h2/2.3.232 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.rabbitmq/amqp-client/5.26.0 2025-10-15T00:45:42,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/config/1.4.3 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/ssl-config-core_2.13/0.6.1 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-client/1.38.1 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-driver/1.38.1 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-core/4.2.36 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.36 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.36 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.36 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.36 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-buffer/4.2.6.Final 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-base/4.2.6.Final 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-compression/4.2.6.Final 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http/4.2.6.Final 2025-10-15T00:45:42,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http2/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-common/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-handler/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-resolver/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-classes-epoll/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-epoll/4.2.6.Final/jar/linux-x86_64 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-unix-common/4.2.6.Final 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.activation/jakarta.activation-api/1.2.2 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.validation/jakarta.validation-api/2.0.2 2025-10-15T00:45:42,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.javassist/javassist/3.30.2-GA 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.servlet/javax.servlet-api/3.1.0 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.3 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.lz4/lz4-java/1.8.0 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:net.bytebuddy/byte-buddy/1.17.7 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.agrona/agrona/1.15.2 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.antlr/antlr4-runtime/4.13.2 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5 2025-10-15T00:45:42,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries/org.apache.aries.util/1.1.3 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-collections/commons-collections/3.2.2 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-beanutils/commons-beanutils/1.11.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-codec/commons-codec/1.19.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-lang3/3.18.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-text/1.14.0 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2 2025-10-15T00:45:42,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.8 2025-10-15T00:45:42,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.8 2025-10-15T00:45:42,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.8 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-osgi/2.15.0 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-scp/2.15.0 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-sftp/2.15.0 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jdt/ecj/3.26.0 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219 2025-10-15T00:45:42,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-api/2.6.1 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-locator/2.6.1 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-utils/2.6.1 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-client/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.8 | mvn:org.glassfish.jersey.core/jersey-common/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-server/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47 2025-10-15T00:45:42,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jline/jline/3.21.0 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jolokia/jolokia-osgi/1.7.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jspecify/jspecify/1.0.0 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm/9.8 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-commons/9.8 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-tree/9.8 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-analysis/9.8 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-util/9.8 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-authn-api/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-cert/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-filterchain/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.21.2 2025-10-15T00:45:42,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/repackaged-shiro/0.21.2 
2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-api/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-api/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.21.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/atomix-storage/11.0.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/blueprint/11.0.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-api/11.0.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-client/11.0.2 2025-10-15T00:45:42,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-dom-api/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-mgmt-api/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/eos-dom-akka/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-api/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-journal/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-spi/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/repackaged-pekko/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-raft/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-segmented-journal/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-api/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-clustering-commons/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-common-util/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-distributed-datastore/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-remoterpc-connector/11.0.2 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.7 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.7 2025-10-15T00:45:42,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.7 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-api/7.1.7 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-impl/7.1.7 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.7 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-util/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-crypt-hash/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-encryption-algs/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-key-exchange-algs/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-mac-algs/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-public-key-algs/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-tls-cipher-suite-algs/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6241/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6243/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6470/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-inet-types/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-yang-types/14.0.18 2025-10-15T00:45:42,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7407-ietf-x509-cert-to-name/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7952/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf-monitoring/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8072/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8341/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-datastores/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-origin/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8343/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8344/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8525/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8526/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8528/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8529/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8639/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8650/14.0.18 2025-10-15T00:45:42,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9640/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9641/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9642/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-client/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-common/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-server/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-client/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-common/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-server/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-client/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-common/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-server/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-api/14.0.18 2025-10-15T00:45:42,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/14.0.18 2025-10-15T00:45:42,744 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-common-api/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-api/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/general-entity/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.26.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-type-util/14.0.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.26.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.26.18 2025-10-15T00:45:42,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/databind/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-dom-api/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-api/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-none/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/9.0.1 2025-10-15T00:45:42,746 
| INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/rfc5277/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/sal-remote/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-api/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-common-mdsal/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/odl-device-notification/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-api/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-nb/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-api/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-mdsal/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-spi/9.0.1 2025-10-15T00:45:42,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-subscription/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/sal-remote-impl/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/shaded-sshd/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-api/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-http/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-ssh/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tcp/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 
- org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tls/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-api/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-none/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/9.0.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.3 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.20.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.20.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.20.1 2025-10-15T00:45:42,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.20.1 2025-10-15T00:45:42,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-api/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-impl/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-shell/0.20.1 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-generator/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-loader/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-model/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.17 2025-10-15T00:45:42,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-spec/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/concepts/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.17 2025-10-15T00:45:42,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/util/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.17 2025-10-15T00:45:42,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-ir/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.17 2025-10-15T00:45:42,752 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.17 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-war/2.6.17/jar/uber 2025-10-15T00:45:42,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-api/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.33 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.component/1.5.1 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.owasp.encoder/encoder/1.3.1 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.scala-lang/scala-library/2.13.16 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.reactivestreams/reactive-streams/1.0.4 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.codehaus.woodstox/stax2-api/4.2.2 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:tech.pantheon.triemap/triemap/1.3.2 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:net.java.dev.stax-utils/stax-utils/20070216 2025-10-15T00:45:42,753 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:org.lmdbjava/lmdbjava/0.7.0 2025-10-15T00:45:42,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Stopping bundles: 2025-10-15T00:45:42,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7 2025-10-15T00:45:42,755 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 2025-10-15T00:45:42,755 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1 2025-10-15T00:45:42,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0 2025-10-15T00:45:42,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1 2025-10-15T00:45:42,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3 2025-10-15T00:45:42,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7 2025-10-15T00:45:42,757 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Uninstalling bundles: 2025-10-15T00:45:42,757 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 2025-10-15T00:45:42,758 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Installing bundles: 2025-10-15T00:45:42,759 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.checkerframework/checker-qual/3.50.0 2025-10-15T00:45:42,760 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.code.gson/gson/2.13.1 2025-10-15T00:45:42,762 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/guava/33.4.8-jre 2025-10-15T00:45:42,766 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/failureaccess/1.0.3 2025-10-15T00:45:42,767 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.googlecode.json-simple/json-simple/1.1.1 2025-10-15T00:45:42,768 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.h2database/h2/2.3.232 2025-10-15T00:45:42,772 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.rabbitmq/amqp-client/5.26.0 2025-10-15T00:45:42,774 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/config/1.4.3 
2025-10-15T00:45:42,775 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/ssl-config-core_2.13/0.6.1 2025-10-15T00:45:42,776 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-client/1.38.1 2025-10-15T00:45:42,777 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-driver/1.38.1 2025-10-15T00:45:42,779 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-core/4.2.36 2025-10-15T00:45:42,780 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.36 2025-10-15T00:45:42,780 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.36 2025-10-15T00:45:42,781 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.36 2025-10-15T00:45:42,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.36 2025-10-15T00:45:42,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-buffer/4.2.6.Final 2025-10-15T00:45:42,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-base/4.2.6.Final 2025-10-15T00:45:42,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-compression/4.2.6.Final 2025-10-15T00:45:42,786 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http/4.2.6.Final 2025-10-15T00:45:42,787 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http2/4.2.6.Final 2025-10-15T00:45:42,789 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-common/4.2.6.Final 2025-10-15T00:45:42,791 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-handler/4.2.6.Final 2025-10-15T00:45:42,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-resolver/4.2.6.Final 2025-10-15T00:45:42,793 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport/4.2.6.Final 2025-10-15T00:45:42,795 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-classes-epoll/4.2.6.Final 2025-10-15T00:45:42,796 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-epoll/4.2.6.Final/jar/linux-x86_64 2025-10-15T00:45:42,797 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-unix-common/4.2.6.Final 2025-10-15T00:45:42,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.activation/jakarta.activation-api/1.2.2 
2025-10-15T00:45:42,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5 2025-10-15T00:45:42,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4 2025-10-15T00:45:42,800 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.validation/jakarta.validation-api/2.0.2 2025-10-15T00:45:42,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6 2025-10-15T00:45:42,802 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.javassist/javassist/3.30.2-GA 2025-10-15T00:45:42,804 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.servlet/javax.servlet-api/3.1.0 2025-10-15T00:45:42,805 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2 2025-10-15T00:45:42,805 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.3 2025-10-15T00:45:42,806 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.lz4/lz4-java/1.8.0 2025-10-15T00:45:42,807 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:net.bytebuddy/byte-buddy/1.17.7 2025-10-15T00:45:42,820 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.agrona/agrona/1.15.2 2025-10-15T00:45:42,821 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.antlr/antlr4-runtime/4.13.2 2025-10-15T00:45:42,822 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1 2025-10-15T00:45:42,823 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2 2025-10-15T00:45:42,824 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3 2025-10-15T00:45:42,825 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5 2025-10-15T00:45:42,826 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0 2025-10-15T00:45:42,827 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0 2025-10-15T00:45:42,827 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8 2025-10-15T00:45:42,828 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0 2025-10-15T00:45:42,829 | INFO | features-3-thread-1 | FeaturesServiceImpl | 
18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14 2025-10-15T00:45:42,830 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0 2025-10-15T00:45:42,831 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries/org.apache.aries.util/1.1.3 2025-10-15T00:45:42,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-collections/commons-collections/3.2.2 2025-10-15T00:45:42,853 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-beanutils/commons-beanutils/1.11.0 2025-10-15T00:45:42,855 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-codec/commons-codec/1.19.0 2025-10-15T00:45:42,859 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-lang3/3.18.0 2025-10-15T00:45:42,861 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-text/1.14.0 2025-10-15T00:45:42,862 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6 2025-10-15T00:45:42,863 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2 2025-10-15T00:45:42,864 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.8 2025-10-15T00:45:42,865 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.8 2025-10-15T00:45:42,866 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.8 2025-10-15T00:45:42,867 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.8 2025-10-15T00:45:42,867 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.8 2025-10-15T00:45:42,868 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.8 2025-10-15T00:45:42,869 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.8 2025-10-15T00:45:42,870 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.8 2025-10-15T00:45:42,870 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.8 2025-10-15T00:45:42,871 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.8 2025-10-15T00:45:42,872 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.8 2025-10-15T00:45:42,875 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.8 2025-10-15T00:45:42,876 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.8 2025-10-15T00:45:42,877 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.8 2025-10-15T00:45:42,877 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.8 2025-10-15T00:45:42,880 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.8 2025-10-15T00:45:42,881 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.8 2025-10-15T00:45:42,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.8 2025-10-15T00:45:42,883 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.8 2025-10-15T00:45:42,884 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.8 2025-10-15T00:45:42,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.8 2025-10-15T00:45:42,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.8 2025-10-15T00:45:42,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.8 2025-10-15T00:45:42,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.8 2025-10-15T00:45:42,888 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.8 2025-10-15T00:45:42,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.8 2025-10-15T00:45:42,891 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.8 2025-10-15T00:45:42,892 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.8 2025-10-15T00:45:42,893 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.8 2025-10-15T00:45:42,894 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.8 2025-10-15T00:45:42,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-osgi/2.15.0 2025-10-15T00:45:42,899 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-scp/2.15.0 2025-10-15T00:45:42,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-sftp/2.15.0 2025-10-15T00:45:42,902 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jdt/ecj/3.26.0 2025-10-15T00:45:42,907 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219 2025-10-15T00:45:42,908 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219 2025-10-15T00:45:42,909 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219 2025-10-15T00:45:42,910 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219 2025-10-15T00:45:42,911 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219 2025-10-15T00:45:42,912 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219 2025-10-15T00:45:42,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219 2025-10-15T00:45:42,914 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219 2025-10-15T00:45:42,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219 2025-10-15T00:45:42,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219 2025-10-15T00:45:42,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219 2025-10-15T00:45:42,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219 2025-10-15T00:45:42,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219 2025-10-15T00:45:42,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-api/2.6.1 2025-10-15T00:45:42,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1 2025-10-15T00:45:42,922 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-locator/2.6.1 2025-10-15T00:45:42,923 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3 2025-10-15T00:45:42,923 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-utils/2.6.1 2025-10-15T00:45:42,924 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47 2025-10-15T00:45:42,925 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47 2025-10-15T00:45:42,926 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-client/2.47 2025-10-15T00:45:42,927 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-common/2.47 2025-10-15T00:45:42,930 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-server/2.47 2025-10-15T00:45:42,933 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47 2025-10-15T00:45:42,934 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47 2025-10-15T00:45:42,935 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jline/jline/3.21.0 2025-10-15T00:45:42,937 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jolokia/jolokia-osgi/1.7.2 2025-10-15T00:45:42,938 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jspecify/jspecify/1.0.0 2025-10-15T00:45:42,939 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm/9.8 2025-10-15T00:45:42,940 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-commons/9.8 2025-10-15T00:45:42,940 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-tree/9.8 2025-10-15T00:45:42,941 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-analysis/9.8 2025-10-15T00:45:42,942 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-util/9.8 2025-10-15T00:45:42,942 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-authn-api/0.21.2 2025-10-15T00:45:42,943 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-cert/0.21.2 2025-10-15T00:45:42,944 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.21.2 2025-10-15T00:45:42,945 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.21.2 2025-10-15T00:45:42,945 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-filterchain/0.21.2 2025-10-15T00:45:42,946 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.21.2 2025-10-15T00:45:42,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.21.2 2025-10-15T00:45:42,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.21.2 2025-10-15T00:45:42,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.21.2 2025-10-15T00:45:42,949 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/repackaged-shiro/0.21.2 2025-10-15T00:45:42,952 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro/0.21.2 2025-10-15T00:45:42,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.21.2 2025-10-15T00:45:42,955 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.21.2 2025-10-15T00:45:42,955 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-api/0.21.2 2025-10-15T00:45:42,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.21.2 2025-10-15T00:45:42,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-api/0.21.2 2025-10-15T00:45:42,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.21.2 2025-10-15T00:45:42,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/atomix-storage/11.0.2 2025-10-15T00:45:42,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/blueprint/11.0.2 2025-10-15T00:45:42,960 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-api/11.0.2 2025-10-15T00:45:42,961 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-client/11.0.2 2025-10-15T00:45:42,962 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-dom-api/11.0.2 2025-10-15T00:45:42,963 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-mgmt-api/11.0.2 2025-10-15T00:45:42,964 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/eos-dom-akka/11.0.2 
2025-10-15T00:45:42,965 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-api/11.0.2 2025-10-15T00:45:42,966 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-journal/11.0.2 2025-10-15T00:45:42,967 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-spi/11.0.2 2025-10-15T00:45:42,967 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/repackaged-pekko/11.0.2 2025-10-15T00:45:42,992 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-raft/11.0.2 2025-10-15T00:45:42,993 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-segmented-journal/11.0.2 2025-10-15T00:45:42,994 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-api/11.0.2 2025-10-15T00:45:42,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/11.0.2 2025-10-15T00:45:42,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-clustering-commons/11.0.2 2025-10-15T00:45:42,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-common-util/11.0.2 2025-10-15T00:45:42,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-distributed-datastore/11.0.2 2025-10-15T00:45:43,001 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-remoterpc-connector/11.0.2 2025-10-15T00:45:43,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.7 2025-10-15T00:45:43,003 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.7 2025-10-15T00:45:43,004 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.7 2025-10-15T00:45:43,005 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-api/7.1.7 2025-10-15T00:45:43,006 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-impl/7.1.7 2025-10-15T00:45:43,006 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.7 2025-10-15T00:45:43,007 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/14.0.18 2025-10-15T00:45:43,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal/mdsal-binding-util/14.0.18 2025-10-15T00:45:43,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-crypt-hash/14.0.18 2025-10-15T00:45:43,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-encryption-algs/14.0.18 2025-10-15T00:45:43,011 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-key-exchange-algs/14.0.18 2025-10-15T00:45:43,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-mac-algs/14.0.18 2025-10-15T00:45:43,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-ssh-public-key-algs/14.0.18 2025-10-15T00:45:43,013 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.iana/iana-tls-cipher-suite-algs/14.0.18 2025-10-15T00:45:43,014 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6241/14.0.18 2025-10-15T00:45:43,016 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6243/14.0.18 2025-10-15T00:45:43,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6470/14.0.18 2025-10-15T00:45:43,018 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-inet-types/14.0.18 2025-10-15T00:45:43,019 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc6991-ietf-yang-types/14.0.18 2025-10-15T00:45:43,019 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7407-ietf-x509-cert-to-name/14.0.18 2025-10-15T00:45:43,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc7952/14.0.18 2025-10-15T00:45:43,021 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf/14.0.18 2025-10-15T00:45:43,022 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8040-ietf-restconf-monitoring/14.0.18 2025-10-15T00:45:43,023 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8072/14.0.18 2025-10-15T00:45:43,024 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8341/14.0.18 2025-10-15T00:45:43,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-datastores/14.0.18 2025-10-15T00:45:43,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8342-ietf-origin/14.0.18 2025-10-15T00:45:43,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8343/14.0.18 2025-10-15T00:45:43,027 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8344/14.0.18 2025-10-15T00:45:43,029 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8525/14.0.18 2025-10-15T00:45:43,030 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8526/14.0.18 2025-10-15T00:45:43,032 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8528/14.0.18 2025-10-15T00:45:43,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8529/14.0.18 2025-10-15T00:45:43,034 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8639/14.0.18 2025-10-15T00:45:43,036 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc8650/14.0.18 2025-10-15T00:45:43,037 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9640/14.0.18 2025-10-15T00:45:43,513 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9641/14.0.18 2025-10-15T00:45:43,518 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9642/14.0.18 2025-10-15T00:45:43,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-client/14.0.18 2025-10-15T00:45:43,524 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-common/14.0.18 2025-10-15T00:45:43,525 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9643-ietf-tcp-server/14.0.18 2025-10-15T00:45:43,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-client/14.0.18 2025-10-15T00:45:43,527 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-common/14.0.18 2025-10-15T00:45:43,532 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9644-ietf-ssh-server/14.0.18 2025-10-15T00:45:43,535 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-client/14.0.18 2025-10-15T00:45:43,538 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-common/14.0.18 2025-10-15T00:45:43,541 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.binding.model.ietf/rfc9645-ietf-tls-server/14.0.18 2025-10-15T00:45:43,543 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/14.0.18 2025-10-15T00:45:43,543 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-api/14.0.18 2025-10-15T00:45:43,544 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/14.0.18 2025-10-15T00:45:43,545 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-common-api/14.0.18 2025-10-15T00:45:43,546 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-api/14.0.18 2025-10-15T00:45:43,547 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/14.0.18 2025-10-15T00:45:43,548 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/14.0.18 2025-10-15T00:45:43,549 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/14.0.18 2025-10-15T00:45:43,550 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/14.0.18 2025-10-15T00:45:43,551 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/14.0.18 2025-10-15T00:45:43,551 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/14.0.18 2025-10-15T00:45:43,552 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/14.0.18 2025-10-15T00:45:43,553 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/14.0.18 2025-10-15T00:45:43,553 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/general-entity/14.0.18 2025-10-15T00:45:43,554 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.26.18 2025-10-15T00:45:43,555 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-type-util/14.0.18 2025-10-15T00:45:43,556 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.26.18 2025-10-15T00:45:43,557 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.26.18 2025-10-15T00:45:43,557 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/databind/9.0.1 2025-10-15T00:45:43,558 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-dom-api/9.0.1 2025-10-15T00:45:43,559 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-api/9.0.1 2025-10-15T00:45:43,559 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-none/9.0.1 2025-10-15T00:45:43,560 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/9.0.1 2025-10-15T00:45:43,562 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/rfc5277/9.0.1 2025-10-15T00:45:43,563 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/sal-remote/9.0.1 2025-10-15T00:45:43,564 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-api/9.0.1 2025-10-15T00:45:43,565 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-common-mdsal/9.0.1 2025-10-15T00:45:43,565 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/odl-device-notification/9.0.1 2025-10-15T00:45:43,566 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-api/9.0.1 2025-10-15T00:45:43,567 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/9.0.1 2025-10-15T00:45:43,568 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-nb/9.0.1 2025-10-15T00:45:43,569 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server/9.0.1 2025-10-15T00:45:43,570 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-api/9.0.1 2025-10-15T00:45:43,571 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/9.0.1 2025-10-15T00:45:43,572 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-mdsal/9.0.1 2025-10-15T00:45:43,573 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-spi/9.0.1 2025-10-15T00:45:43,574 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.netconf/restconf-subscription/9.0.1 2025-10-15T00:45:43,575 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/sal-remote-impl/9.0.1 2025-10-15T00:45:43,576 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/shaded-sshd/9.0.1 2025-10-15T00:45:43,582 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-api/9.0.1 2025-10-15T00:45:43,583 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-http/9.0.1 2025-10-15T00:45:43,586 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-ssh/9.0.1 2025-10-15T00:45:43,588 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tcp/9.0.1 2025-10-15T00:45:43,589 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tls/9.0.1 2025-10-15T00:45:43,590 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-api/9.0.1 2025-10-15T00:45:43,590 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-none/9.0.1 2025-10-15T00:45:43,591 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/9.0.1 2025-10-15T00:45:43,592 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.3 2025-10-15T00:45:43,592 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.20.1 2025-10-15T00:45:43,596 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.20.1 2025-10-15T00:45:43,597 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.20.1 2025-10-15T00:45:43,598 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.20.1 2025-10-15T00:45:43,599 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.20.1 2025-10-15T00:45:43,600 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.20.1 2025-10-15T00:45:43,601 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.20.1 2025-10-15T00:45:43,603 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.20.1 
2025-10-15T00:45:43,603 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.20.1 2025-10-15T00:45:43,604 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.20.1 2025-10-15T00:45:43,605 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.20.1 2025-10-15T00:45:43,606 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.20.1 2025-10-15T00:45:43,607 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.20.1 2025-10-15T00:45:43,607 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.20.1 2025-10-15T00:45:43,608 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.20.1 2025-10-15T00:45:43,612 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.20.1 2025-10-15T00:45:43,613 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.20.1 2025-10-15T00:45:43,618 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.20.1 2025-10-15T00:45:43,619 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.20.1 2025-10-15T00:45:43,626 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.20.1 2025-10-15T00:45:43,632 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.20.1 2025-10-15T00:45:43,635 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.20.1 2025-10-15T00:45:43,636 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.20.1 2025-10-15T00:45:43,648 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.20.1 2025-10-15T00:45:43,649 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.20.1 2025-10-15T00:45:43,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.20.1 2025-10-15T00:45:43,663 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.20.1 2025-10-15T00:45:43,663 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.20.1 2025-10-15T00:45:43,664 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-api/0.20.1 2025-10-15T00:45:43,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-impl/0.20.1 2025-10-15T00:45:43,666 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-shell/0.20.1 2025-10-15T00:45:43,667 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.17 2025-10-15T00:45:43,667 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.17 2025-10-15T00:45:43,668 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.17 2025-10-15T00:45:43,669 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.17 2025-10-15T00:45:43,670 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-generator/14.0.17 2025-10-15T00:45:43,671 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-loader/14.0.17 2025-10-15T00:45:43,672 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-model/14.0.17 2025-10-15T00:45:43,673 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.17 2025-10-15T00:45:43,673 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.17 2025-10-15T00:45:43,674 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.17 2025-10-15T00:45:43,675 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.17 2025-10-15T00:45:43,676 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-spec/14.0.17 2025-10-15T00:45:43,677 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.17 2025-10-15T00:45:43,677 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/concepts/14.0.17 2025-10-15T00:45:43,678 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.yangtools/odlext-model-api/14.0.17 2025-10-15T00:45:43,678 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.17 2025-10-15T00:45:43,679 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.17 2025-10-15T00:45:43,680 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.17 2025-10-15T00:45:43,680 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.17 2025-10-15T00:45:43,681 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.17 2025-10-15T00:45:43,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.17 2025-10-15T00:45:43,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.17 2025-10-15T00:45:43,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.17 2025-10-15T00:45:43,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.17 2025-10-15T00:45:43,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.17 2025-10-15T00:45:43,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.17 2025-10-15T00:45:43,685 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.17 2025-10-15T00:45:43,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.17 2025-10-15T00:45:43,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.17 2025-10-15T00:45:43,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.17 2025-10-15T00:45:43,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.17 2025-10-15T00:45:43,688 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.17 2025-10-15T00:45:43,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.17 2025-10-15T00:45:43,689 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.17 2025-10-15T00:45:43,690 
| INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/util/14.0.17 2025-10-15T00:45:43,691 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common/14.0.17 2025-10-15T00:45:43,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.17 2025-10-15T00:45:43,692 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.17 2025-10-15T00:45:43,693 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.17 2025-10-15T00:45:43,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.17 2025-10-15T00:45:43,695 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.17 2025-10-15T00:45:43,696 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.17 2025-10-15T00:45:43,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.17 2025-10-15T00:45:43,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.17 2025-10-15T00:45:43,698 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.17 2025-10-15T00:45:43,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.17 2025-10-15T00:45:43,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.17 2025-10-15T00:45:43,700 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.17 2025-10-15T00:45:43,701 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-ir/14.0.17 2025-10-15T00:45:43,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.17 2025-10-15T00:45:43,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.17 2025-10-15T00:45:43,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.17 2025-10-15T00:45:43,705 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.17 2025-10-15T00:45:43,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.17 
2025-10-15T00:45:43,706 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.17 2025-10-15T00:45:43,707 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.17 2025-10-15T00:45:43,708 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.17 2025-10-15T00:45:43,709 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.17 2025-10-15T00:45:43,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.17 2025-10-15T00:45:43,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.17 2025-10-15T00:45:43,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.17 2025-10-15T00:45:43,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.17 2025-10-15T00:45:43,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.17 2025-10-15T00:45:43,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.17 2025-10-15T00:45:43,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-war/2.6.17/jar/uber 2025-10-15T00:45:43,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-api/8.0.33 2025-10-15T00:45:43,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.33 2025-10-15T00:45:43,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.33 2025-10-15T00:45:43,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.33 2025-10-15T00:45:43,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.33 2025-10-15T00:45:43,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.33 2025-10-15T00:45:43,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.33 2025-10-15T00:45:43,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.33 2025-10-15T00:45:43,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.33 2025-10-15T00:45:43,749 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.33 2025-10-15T00:45:43,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.33 2025-10-15T00:45:43,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.component/1.5.1 2025-10-15T00:45:43,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.owasp.encoder/encoder/1.3.1 2025-10-15T00:45:43,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.scala-lang/scala-library/2.13.16 2025-10-15T00:45:43,760 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.reactivestreams/reactive-streams/1.0.4 2025-10-15T00:45:43,761 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.codehaus.woodstox/stax2-api/4.2.2 2025-10-15T00:45:43,762 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:tech.pantheon.triemap/triemap/1.3.2 2025-10-15T00:45:43,762 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:net.java.dev.stax-utils/stax-utils/20070216 2025-10-15T00:45:43,763 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:org.lmdbjava/lmdbjava/0.7.0 2025-10-15T00:45:43,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-password-service-config.xml 2025-10-15T00:45:43,779 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/configuration/factory/pekko.conf 2025-10-15T00:45:43,779 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.controller.cluster.datastore.cfg 2025-10-15T00:45:43,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0//etc/org.jolokia.osgi.cfg 2025-10-15T00:45:43,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg 2025-10-15T00:45:43,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/default-openflow-connection-config.xml 2025-10-15T00:45:43,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/legacy-openflow-connection-config.xml 2025-10-15T00:45:43,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-app-config.xml 2025-10-15T00:45:43,786 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating 
configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-datastore-config.xml 2025-10-15T00:45:43,786 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/bin/idmtool 2025-10-15T00:45:43,786 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0//etc/org.opendaylight.aaa.filterchain.cfg 2025-10-15T00:45:43,787 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/opendaylight/datastore/initial/config/aaa-cert-config.xml 2025-10-15T00:45:43,787 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/jetty-web.xml 2025-10-15T00:45:43,789 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.0/etc/org.opendaylight.restconf.nb.rfc8040.cfg 2025-10-15T00:45:43,791 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Refreshing bundles: 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3 (Attached fragments changed: [org.ops4j.pax.web.pax-web-compatibility-el2/8.0.33]) 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1 (Wired to javax.el-api/3.0.3 which is being refreshed) 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0 (Wired to javax.enterprise.cdi-api/2.0.0.SP1 which is being refreshed) 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1 (Should be wired to: jakarta.servlet-api/4.0.0 (through [org.apache.servicemix.bundles.jasypt/1.9.3.1] osgi.wiring.package; resolution:=optional; filter:="(osgi.wiring.package=javax.servlet)")) 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 (Bundle will be uninstalled) 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7 (Wired to org.apache.servicemix.bundles.jasypt/1.9.3.1 which is being refreshed) 2025-10-15T00:45:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7 (Wired to javax.transaction-api/1.2.0 which is being refreshed) 2025-10-15T00:45:44,384 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Starting bundles: 2025-10-15T00:45:44,385 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm/9.8.0 2025-10-15T00:45:44,388 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.tree/9.8.0 2025-10-15T00:45:44,388 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.commons/9.8.0 
2025-10-15T00:45:44,389 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.api/1.0.1 2025-10-15T00:45:44,389 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.tree.analysis/9.8.0 2025-10-15T00:45:44,389 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.quiesce.api/1.0.0 2025-10-15T00:45:44,390 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.util/9.8.0 2025-10-15T00:45:44,390 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.proxy/1.1.14 2025-10-15T00:45:44,431 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.core/1.10.3 2025-10-15T00:45:44,683 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started 2025-10-15T00:45:44,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.cm/1.3.2 2025-10-15T00:45:45,080 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.restconf.nb.rfc8040} from /tmp/karaf-0.23.0/etc/org.opendaylight.restconf.nb.rfc8040.cfg 2025-10-15T00:45:45,082 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.jolokia.osgi} from /tmp/karaf-0.23.0/etc/org.jolokia.osgi.cfg 2025-10-15T00:45:45,084 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.controller.cluster.datastore} from /tmp/karaf-0.23.0/etc/org.opendaylight.controller.cluster.datastore.cfg 2025-10-15T00:45:45,085 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.openflowplugin} from /tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg 2025-10-15T00:45:45,086 | INFO | fileinstall-/tmp/karaf-0.23.0/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.aaa.filterchain} from /tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg 2025-10-15T00:45:45,114 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started 2025-10-15T00:45:45,115 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.blueprint/4.4.8 2025-10-15T00:45:45,119 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.kar/4.4.8 2025-10-15T00:45:45,123 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.wrap/4.4.8 2025-10-15T00:45:45,127 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.features/4.4.8 2025-10-15T00:45:45,145 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.osgi/2.15.0 
2025-10-15T00:45:45,145 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.scp/2.15.0 2025-10-15T00:45:45,146 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.sftp/2.15.0 2025-10-15T00:45:45,146 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jline/3.21.0 2025-10-15T00:45:45,146 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.core/4.4.8 2025-10-15T00:45:45,179 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.8 2025-10-15T00:45:45,181 | INFO | features-3-thread-1 | Activator | 120 - org.apache.karaf.shell.core - 4.4.8 | Not starting local console. To activate set karaf.startLocalConsole=true 2025-10-15T00:45:45,199 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.8 has been started 2025-10-15T00:45:45,205 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.config.command/4.4.8 2025-10-15T00:45:45,223 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.config.command/4.4.8 2025-10-15T00:45:45,293 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.servlet-api/4.0.0 2025-10-15T00:45:45,305 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.websocket-api/1.1.2 2025-10-15T00:45:45,309 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-api/8.0.33 2025-10-15T00:45:45,310 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-spi/8.0.33 2025-10-15T00:45:45,310 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.system.core/4.4.8 2025-10-15T00:45:45,320 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.system.core/4.4.8 2025-10-15T00:45:45,320 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.features.command/4.4.8 2025-10-15T00:45:45,330 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.features.command/4.4.8 2025-10-15T00:45:45,331 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.instance.core/4.4.8 2025-10-15T00:45:45,348 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.instance.core/4.4.8 2025-10-15T00:45:45,349 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3 2025-10-15T00:45:45,350 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-tomcat-common/8.0.33 2025-10-15T00:45:45,351 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jdt.core.compiler.batch/3.26.0.v20210609-0549 2025-10-15T00:45:45,352 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-jsp/8.0.33 2025-10-15T00:45:45,352 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.config/4.4.8 2025-10-15T00:45:45,357 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.osgi.service.component/1.5.1.202212101352 2025-10-15T00:45:45,358 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.util/9.4.57.v20241219 2025-10-15T00:45:45,358 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.jmx/9.4.57.v20241219 2025-10-15T00:45:45,359 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.io/9.4.57.v20241219 2025-10-15T00:45:45,359 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.http/9.4.57.v20241219 2025-10-15T00:45:45,359 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.client/9.4.57.v20241219 2025-10-15T00:45:45,359 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-extender-war/8.0.33 2025-10-15T00:45:45,366 | INFO | features-3-thread-1 | Activator | 392 - org.ops4j.pax.web.pax-web-extender-war - 8.0.33 | Configuring WAR extender thread pool. Pool size = 3 2025-10-15T00:45:45,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.felix.scr/2.2.6 2025-10-15T00:45:45,473 | INFO | features-3-thread-1 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Starting with globalExtender setting: false 2025-10-15T00:45:45,475 | INFO | features-3-thread-1 | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Version = 2.2.6 2025-10-15T00:45:45,483 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-websocket/8.0.33 2025-10-15T00:45:45,484 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.service.core/4.4.8 2025-10-15T00:45:45,489 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.service.core/4.4.8 2025-10-15T00:45:45,490 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.server/9.4.57.v20241219 2025-10-15T00:45:45,500 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.security/9.4.57.v20241219 2025-10-15T00:45:45,500 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.bundle.core/4.4.8 2025-10-15T00:45:45,521 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.8 2025-10-15T00:45:45,521 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core 
- 4.4.8 | org.eclipse.jetty.util.ajax/9.4.57.v20241219 2025-10-15T00:45:45,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.servlet/9.4.57.v20241219 2025-10-15T00:45:45,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.xml/9.4.57.v20241219 2025-10-15T00:45:45,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.jaas/9.4.57.v20241219 2025-10-15T00:45:45,523 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.servlets/9.4.57.v20241219 2025-10-15T00:45:45,523 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-jetty/8.0.33 2025-10-15T00:45:45,531 | INFO | features-3-thread-1 | log | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @12944ms to org.eclipse.jetty.util.log.Slf4jLog 2025-10-15T00:45:45,538 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-runtime/8.0.33 2025-10-15T00:45:45,559 | INFO | features-3-thread-1 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics. 2025-10-15T00:45:45,559 | INFO | features-3-thread-1 | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Pax Web Runtime started 2025-10-15T00:45:45,559 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because configuration has changed 2025-10-15T00:45:45,560 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.http.core/4.4.8 2025-10-15T00:45:45,564 | INFO | paxweb-config-3-thread-1 (change config) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered 2025-10-15T00:45:45,572 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.8. 
Missing service: [org.apache.karaf.http.core.ProxyService] 2025-10-15T00:45:45,572 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-extender-whiteboard/8.0.33 2025-10-15T00:45:45,573 | INFO | features-3-thread-1 | Activator | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | Starting Pax Web Whiteboard Extender 2025-10-15T00:45:45,588 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.modules/4.4.8 2025-10-15T00:45:45,595 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.command/4.4.8 2025-10-15T00:45:45,606 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.8 2025-10-15T00:45:45,607 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8 2025-10-15T00:45:45,608 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8 2025-10-15T00:45:45,608 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.package.core/4.4.8 2025-10-15T00:45:45,614 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.package.core/4.4.8 2025-10-15T00:45:45,614 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.log.core/4.4.8 2025-10-15T00:45:45,640 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.log.core/4.4.8. Missing service: [org.apache.karaf.log.core.LogService] 2025-10-15T00:45:45,640 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-10-15T00:45:45,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.blueprint.api/1.2.0 2025-10-15T00:45:45,640 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Configuring JettyServerController{configuration=b3a993db-7aab-4f9f-a847-e2e781f64177,state=UNCONFIGURED} 2025-10-15T00:45:45,640 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating Jetty server instance using configuration properties. 
2025-10-15T00:45:45,643 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8 2025-10-15T00:45:45,670 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Unregistering commands for bundle org.apache.karaf.log.core/4.4.8 2025-10-15T00:45:45,642 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.web.core/4.4.8 2025-10-15T00:45:45,671 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8 2025-10-15T00:45:45,672 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Processing Jetty configuration from files: [etc/jetty.xml] 2025-10-15T00:45:45,694 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.8. Missing service: [org.apache.karaf.web.WebContainerService] 2025-10-15T00:45:45,694 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.scr.state/4.4.8 2025-10-15T00:45:45,842 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.url.war/2.6.17 2025-10-15T00:45:45,847 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.whiteboard/1.2.0 2025-10-15T00:45:45,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.ssh/4.4.8 2025-10-15T00:45:45,880 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.8. 
Missing service: [org.apache.sshd.server.SshServer] 2025-10-15T00:45:45,884 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.blueprint.core/1.2.0 2025-10-15T00:45:45,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.management.server/4.4.8 2025-10-15T00:45:45,893 | INFO | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.8 | Setting java.rmi.server.hostname system property to 127.0.0.1 2025-10-15T00:45:45,900 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.scr.management/4.4.8 2025-10-15T00:45:45,903 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.api/1.1.5 2025-10-15T00:45:45,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.table/4.4.8 2025-10-15T00:45:45,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.continuation/9.4.57.v20241219 2025-10-15T00:45:45,905 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.bundle.blueprintstate/4.4.8 2025-10-15T00:45:45,918 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Found configured connector "jetty-default": 0.0.0.0:8181 2025-10-15T00:45:45,919 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Using configured jetty-default@501bd737{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181 2025-10-15T00:45:45,919 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp1808674182]@6bce2d86{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY] 2025-10-15T00:45:45,943 | INFO | paxweb-config-3-thread-1 (change controller) | JettyFactory | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding JMX support to Jetty server 2025-10-15T00:45:45,965 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.commands/4.4.8 2025-10-15T00:45:45,982 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.8 2025-10-15T00:45:45,983 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.8 2025-10-15T00:45:45,986 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-10-15T00:45:45,986 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting JettyServerController{configuration=b3a993db-7aab-4f9f-a847-e2e781f64177,state=STOPPED} 2025-10-15T00:45:45,986 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Server@3952dd85{STOPPED}[9.4.57.v20241219] 2025-10-15T00:45:45,987 | INFO | 
paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.8+9-Ubuntu-0ubuntu122.04.1 2025-10-15T00:45:45,989 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.diagnostic.core/4.4.8 2025-10-15T00:45:45,997 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.8 2025-10-15T00:45:45,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.kar.core/4.4.8 2025-10-15T00:45:46,009 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.kar.core/4.4.8. Missing service: [org.apache.karaf.kar.KarService] 2025-10-15T00:45:46,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.core/1.1.8 2025-10-15T00:45:46,014 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent 2025-10-15T00:45:46,026 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.kar.core/4.4.8 2025-10-15T00:45:46,028 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0 2025-10-15T00:45:46,028 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults 2025-10-15T00:45:46,029 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76] for service with service.id [15] 2025-10-15T00:45:46,029 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 600000ms 2025-10-15T00:45:46,031 | INFO | features-3-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76] for service with service.id [39] 2025-10-15T00:45:46,058 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.blueprint/11.0.2 2025-10-15T00:45:46,059 | INFO | features-3-thread-1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Starting BlueprintBundleTracker 2025-10-15T00:45:46,075 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.8 [120] was successfully created 2025-10-15T00:45:46,075 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [78] was successfully created 2025-10-15T00:45:46,076 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle 
org.apache.aries.blueprint.core_1.10.3 [79] was successfully created 2025-10-15T00:45:46,095 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.8 2025-10-15T00:45:46,138 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 125 - org.apache.sshd.osgi - 2.15.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory 2025-10-15T00:45:46,212 | INFO | paxweb-config-3-thread-1 (change controller) | AbstractConnector | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@501bd737{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} 2025-10-15T00:45:46,213 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @13628ms 2025-10-15T00:45:46,215 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpService factory 2025-10-15T00:45:46,224 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.8 [105]] 2025-10-15T00:45:46,228 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.guava.failureaccess/1.0.3 2025-10-15T00:45:46,239 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.33 [392]] 2025-10-15T00:45:46,944 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.annotation-api/1.3.5 2025-10-15T00:45:46,945 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.8 [124]] 2025-10-15T00:45:46,945 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.guava/33.4.8.jre 2025-10-15T00:45:46,946 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpServiceRuntime 2025-10-15T00:45:46,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.concepts/14.0.17 2025-10-15T00:45:46,948 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.33 [393]] 2025-10-15T00:45:46,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | triemap/1.3.2 2025-10-15T00:45:46,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.util/14.0.17 2025-10-15T00:45:46,953 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-common/14.0.17 2025-10-15T00:45:46,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-common-api/14.0.18 2025-10-15T00:45:46,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.yangtools.binding-spec/14.0.17 2025-10-15T00:45:46,954 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8342-ietf-datastores/14.0.18 2025-10-15T00:45:46,955 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-xpath-api/14.0.17 2025-10-15T00:45:46,955 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-api/14.0.17 2025-10-15T00:45:46,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-api/14.0.17 2025-10-15T00:45:46,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-spi/14.0.17 2025-10-15T00:45:46,956 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8528-model-api/14.0.17 2025-10-15T00:45:46,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8040-model-api/14.0.17 2025-10-15T00:45:46,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc7952-model-api/14.0.17 2025-10-15T00:45:46,957 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-ir/14.0.17 2025-10-15T00:45:46,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-spi/14.0.17 2025-10-15T00:45:46,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-util/14.0.17 2025-10-15T00:45:46,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-util/14.0.17 2025-10-15T00:45:46,960 | INFO | activator-1-thread-3 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.http.core/4.4.8 2025-10-15T00:45:46,962 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-impl/14.0.17 2025-10-15T00:45:46,962 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.web.core/4.4.8 2025-10-15T00:45:46,964 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2025-10-15T00:45:46,964 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.servlet-api/3.1.0 2025-10-15T00:45:46,965 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=2} 2025-10-15T00:45:46,965 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for 
ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2025-10-15T00:45:46,965 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.collections/3.2.2 2025-10-15T00:45:46,966 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.commons-beanutils/1.11.0 2025-10-15T00:45:46,966 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean 2025-10-15T00:45:46,966 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.owasp.encoder/1.3.1 2025-10-15T00:45:46,967 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,967 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,967 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.repackaged-shiro/0.21.2 2025-10-15T00:45:46,968 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,968 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.ietf-type-util/14.0.18 2025-10-15T00:45:46,968 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-reflect/14.0.17 2025-10-15T00:45:46,969 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6991-ietf-inet-types/14.0.18 2025-10-15T00:45:46,969 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,969 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,969 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6991-ietf-yang-types/14.0.18 2025-10-15T00:45:46,969 | INFO | activator-1-thread-1 | core | 83 - 
org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,969 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,969 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.codegen-extensions/14.0.17 2025-10-15T00:45:46,970 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-api/0.20.1 2025-10-15T00:45:46,971 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.yang-ext/2013.9.7.26_18 2025-10-15T00:45:46,971 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.inventory/0.20.1 2025-10-15T00:45:46,971 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.opendaylight-l2-types/2013.8.27.26_18 2025-10-15T00:45:46,972 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-base/0.20.1 2025-10-15T00:45:46,973 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-service/0.20.1 2025-10-15T00:45:46,973 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-statistics/0.20.1 2025-10-15T00:45:46,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.typesafe.config/1.4.3 2025-10-15T00:45:46,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.scala-lang.scala-library/2.13.16.v20250107-233423-VFINAL-3f6bdae 2025-10-15T00:45:46,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.typesafe.sslconfig/0.6.1 2025-10-15T00:45:46,983 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.agrona.core/1.15.2 2025-10-15T00:45:46,983 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.client/1.38.1 2025-10-15T00:45:46,983 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.driver/1.38.1 2025-10-15T00:45:46,984 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap_file__tmp_karaf-0.23.0_system_org_lmdbjava_lmdbjava_0.7.0_lmdbjava-0.7.0.jar/0.0.0 2025-10-15T00:45:46,985 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | reactive-streams/1.0.4 2025-10-15T00:45:46,985 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 
| org.opendaylight.controller.repackaged-pekko/11.0.2 2025-10-15T00:45:46,987 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.8 | Deactivating the Apache Karaf ServiceComponentRuntime MBean 2025-10-15T00:45:46,988 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,988 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,988 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,989 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,989 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,989 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,989 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Unregistering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@113018a8 with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:46,989 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | net.bytebuddy.byte-buddy/1.17.7 2025-10-15T00:45:46,989 | WARN | Framework Event Dispatcher: Equinox Container: 9c53ab35-3fc9-4fa4-a88c-87952efc5a76 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Task rejected for JMX Notification dispatch of event [org.osgi.framework.BundleEvent[source=org.opendaylight.controller.repackaged-pekko_11.0.2 [189]]] - Dispatcher may have been shutdown 2025-10-15T00:45:46,989 | WARN | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Task rejected for JMX Notification dispatch of event 
[org.osgi.framework.ServiceEvent[source={javax.management.MBeanServer}={service.id=160, service.bundleid=113, service.scope=singleton}]] - Dispatcher may have been shutdown 2025-10-15T00:45:46,994 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.geronimo.specs.geronimo-atinject_1.0_spec/1.2.0 2025-10-15T00:45:46,994 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.core/4.2.36 2025-10-15T00:45:46,994 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.jmx/4.2.36 2025-10-15T00:45:46,995 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | lz4-java/1.8.0 2025-10-15T00:45:46,995 | WARN | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.8 | java.rmi.server.hostname system property is already set to 127.0.0.1. Apache Karaf doesn't override it 2025-10-15T00:45:46,995 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-api/11.0.2 2025-10-15T00:45:46,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-spi/11.0.2 2025-10-15T00:45:46,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-api/14.0.17 2025-10-15T00:45:46,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-spi/14.0.17 2025-10-15T00:45:46,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-binfmt/14.0.17 2025-10-15T00:45:46,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-api/14.0.17 2025-10-15T00:45:46,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-spi/14.0.17 2025-10-15T00:45:46,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8343/14.0.18 2025-10-15T00:45:46,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8341/14.0.18 2025-10-15T00:45:46,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8344/14.0.18 2025-10-15T00:45:46,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8528/14.0.18 2025-10-15T00:45:47,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8529/14.0.18 2025-10-15T00:45:47,001 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8040-ietf-restconf/14.0.18 2025-10-15T00:45:47,028 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean 
2025-10-15T00:45:47,029 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,029 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,029 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,029 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,029 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,029 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,030 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@46f2122a with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=9c53ab35-3fc9-4fa4-a88c-87952efc5a76 2025-10-15T00:45:47,045 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8639/14.0.18 2025-10-15T00:45:47,045 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-binding-api/14.0.18 2025-10-15T00:45:47,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-common-api/14.0.18 2025-10-15T00:45:47,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.general-entity/14.0.18 2025-10-15T00:45:47,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-binding-api/14.0.18 2025-10-15T00:45:47,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-singleton-api/14.0.18 2025-10-15T00:45:47,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.api/0.20.1 2025-10-15T00:45:47,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.extension-api/0.20.1 2025-10-15T00:45:47,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin/0.20.1 2025-10-15T00:45:47,050 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9640/14.0.18 2025-10-15T00:45:47,051 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9642/14.0.18 2025-10-15T00:45:47,051 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.tls-cipher-suite-algs/14.0.18 2025-10-15T00:45:47,051 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-common/14.0.18 2025-10-15T00:45:47,052 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9641/14.0.18 2025-10-15T00:45:47,052 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-server/14.0.18 2025-10-15T00:45:47,052 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.resolver/4.2.6.Final 2025-10-15T00:45:47,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport/4.2.6.Final 2025-10-15T00:45:47,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport-native-unix-common/4.2.6.Final 2025-10-15T00:45:47,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-base/4.2.6.Final 2025-10-15T00:45:47,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.handler/4.2.6.Final 2025-10-15T00:45:47,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.sal-remote/9.0.1 2025-10-15T00:45:47,055 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap_file__tmp_karaf-0.23.0_system_net_java_dev_stax-utils_stax-utils_20070216_stax-utils-20070216.jar/0.0.0 2025-10-15T00:45:47,055 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.gson/2.13.1 2025-10-15T00:45:47,056 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-gson/14.0.17 2025-10-15T00:45:47,072 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | stax2-api/4.2.2 2025-10-15T00:45:47,072 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.yangtools.yang-data-codec-xml/14.0.17 2025-10-15T00:45:47,072 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.databind/9.0.1 2025-10-15T00:45:47,073 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-api/14.0.17 2025-10-15T00:45:47,073 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-spi/14.0.17 2025-10-15T00:45:47,074 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-api/14.0.18 2025-10-15T00:45:47,074 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6241/14.0.18 2025-10-15T00:45:47,074 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.netconf-api/9.0.1 2025-10-15T00:45:47,075 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.dom-api/9.0.1 2025-10-15T00:45:47,075 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.lang3/3.18.0 2025-10-15T00:45:47,075 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | checker-qual/3.50.0 2025-10-15T00:45:47,076 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.ready-api/7.1.7 2025-10-15T00:45:47,076 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-api/7.1.7 2025-10-15T00:45:47,077 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.util/7.1.7 2025-10-15T00:45:47,077 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-binding-spi/14.0.18 2025-10-15T00:45:47,077 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport-classes-epoll/4.2.6.Final 2025-10-15T00:45:47,078 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-spi/0.20.1 2025-10-15T00:45:47,078 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-common-netty/14.0.17 2025-10-15T00:45:47,078 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.odlparent.bundles-diag/14.1.3 2025-10-15T00:45:47,118 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.ready-impl/7.1.7 2025-10-15T00:45:47,136 | INFO | features-3-thread-1 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | ThreadFactory for SystemReadyService created 2025-10-15T00:45:47,137 | INFO | features-3-thread-1 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)... 
2025-10-15T00:45:47,139 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-impl/7.1.7 2025-10-15T00:45:47,139 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos() started... 2025-10-15T00:45:47,146 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@673cef67{/,null,STOPPED} 2025-10-15T00:45:47,149 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@673cef67{/,null,STOPPED} 2025-10-15T00:45:47,152 | INFO | features-3-thread-1 | DiagStatusServiceImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service started 2025-10-15T00:45:47,160 | INFO | features-3-thread-1 | MBeanUtils | 198 - org.opendaylight.infrautils.diagstatus-api - 7.1.7 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL. 2025-10-15T00:45:47,160 | INFO | features-3-thread-1 | DiagStatusServiceMBeanImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service management started 2025-10-15T00:45:47,161 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl/0.20.1 2025-10-15T00:45:47,175 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.common/0.20.1 2025-10-15T00:45:47,180 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.odlext-model-api/14.0.17 2025-10-15T00:45:47,180 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-spi/14.0.18 2025-10-15T00:45:47,181 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-api/14.0.17 2025-10-15T00:45:47,181 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-model/14.0.17 2025-10-15T00:45:47,181 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-api/14.0.17 2025-10-15T00:45:47,182 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-spi/14.0.17 2025-10-15T00:45:47,182 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-loader/14.0.17 2025-10-15T00:45:47,182 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-dynamic/14.0.17 2025-10-15T00:45:47,187 | INFO | features-3-thread-1 | SimpleBindingDOMCodecFactory | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Binding/DOM Codec enabled 2025-10-15T00:45:47,187 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.odlext-parser-support/14.0.17 2025-10-15T00:45:47,190 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.openconfig-model-api/14.0.17 2025-10-15T00:45:47,190 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.openconfig-parser-support/14.0.17 2025-10-15T00:45:47,190 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6241-model-api/14.0.17 2025-10-15T00:45:47,191 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6241-parser-support/14.0.17 2025-10-15T00:45:47,191 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6536-model-api/14.0.17 2025-10-15T00:45:47,192 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6536-parser-support/14.0.17 2025-10-15T00:45:47,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6643-model-api/14.0.17 2025-10-15T00:45:47,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6643-parser-support/14.0.17 2025-10-15T00:45:47,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-ri/14.0.17 2025-10-15T00:45:47,194 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc7952-parser-support/14.0.17 2025-10-15T00:45:47,194 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8040-parser-support/14.0.17 2025-10-15T00:45:47,194 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8528-parser-support/14.0.17 2025-10-15T00:45:47,195 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8639-model-api/14.0.17 2025-10-15T00:45:47,195 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8639-parser-support/14.0.17 2025-10-15T00:45:47,195 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8819-model-api/14.0.17 2025-10-15T00:45:47,196 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8819-parser-support/14.0.17 2025-10-15T00:45:47,196 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.antlr.antlr4-runtime/4.13.2 2025-10-15T00:45:47,196 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-reactor/14.0.17 2025-10-15T00:45:47,197 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-rfc7950/14.0.17 2025-10-15T00:45:47,197 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-xpath-impl/14.0.17 2025-10-15T00:45:47,199 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-impl/14.0.17 2025-10-15T00:45:47,203 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-spi/14.0.17 2025-10-15T00:45:47,204 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-generator/14.0.17 2025-10-15T00:45:47,214 | INFO | features-3-thread-1 | DefaultBindingRuntimeGenerator | 328 - org.opendaylight.yangtools.binding-generator - 14.0.17 | Binding/YANG type support activated 2025-10-15T00:45:47,215 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-osgi/14.0.17 2025-10-15T00:45:47,244 | INFO | features-3-thread-1 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activating 2025-10-15T00:45:47,246 | INFO | features-3-thread-1 | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activated 2025-10-15T00:45:47,251 | INFO | features-3-thread-1 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime starting 2025-10-15T00:45:47,344 | INFO | features-3-thread-1 | KarafFeaturesSupport | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Will attempt to integrate with Karaf FeaturesService 2025-10-15T00:45:48,768 | INFO | features-3-thread-1 | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 9.0.1 | Netty transport backed by epoll(2) 2025-10-15T00:45:48,991 | INFO | features-3-thread-1 | SharedEffectiveModelContextFactory | 379 - org.opendaylight.yangtools.yang-parser-impl - 14.0.17 | Using weak references 2025-10-15T00:45:51,245 | INFO | features-3-thread-1 | OSGiModuleInfoSnapshotImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | EffectiveModelContext generation 1 activated 2025-10-15T00:45:52,006 | INFO | features-3-thread-1 | OSGiBindingRuntimeContextImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | BindingRuntimeContext generation 1 activated 2025-10-15T00:45:52,007 | INFO | features-3-thread-1 | GlobalBindingRuntimeContext | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Global BindingRuntimeContext generation 1 activated 2025-10-15T00:45:52,007 | INFO | features-3-thread-1 | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime started 2025-10-15T00:45:52,008 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-osgi/14.0.17 2025-10-15T00:45:52,017 | INFO | features-3-thread-1 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activating 2025-10-15T00:45:52,039 | INFO | features-3-thread-1 | OSGiBindingDOMCodecServicesImpl | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec generation 1 activated 2025-10-15T00:45:52,039 | INFO | features-3-thread-1 | GlobalBindingDOMCodecServices | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Global Binding/DOM Codec activated with generation 1 
2025-10-15T00:45:52,041 | INFO | features-3-thread-1 | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activated 2025-10-15T00:45:52,041 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding-dom-adapter/14.0.18 2025-10-15T00:45:52,079 | INFO | features-3-thread-1 | OSGiBlockingBindingNormalizer | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter activated 2025-10-15T00:45:52,087 | INFO | features-3-thread-1 | DynamicBindingAdapter | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | 8 DOMService trackers started 2025-10-15T00:45:52,088 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-dom-api/14.0.18 2025-10-15T00:45:52,821 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.eos-binding-adapter/14.0.18 2025-10-15T00:45:52,834 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-singleton-impl/14.0.18 2025-10-15T00:45:52,835 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.impl/0.20.1 2025-10-15T00:45:52,883 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory)] 2025-10-15T00:45:52,885 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.impl/0.20.1. Missing service: [org.opendaylight.openflowplugin.api.openflow.statistics.ofpspecific.MessageIntelligenceAgency] 2025-10-15T00:45:52,891 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T00:45:52,895 | INFO | features-3-thread-1 | MessageIntelligenceAgencyImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean 2025-10-15T00:45:52,896 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.20.1 2025-10-15T00:45:52,896 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1 2025-10-15T00:45:52,897 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1. 
Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager] 2025-10-15T00:45:52,900 | INFO | features-3-thread-1 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | ReconciliationManager started 2025-10-15T00:45:52,901 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1 2025-10-15T00:45:52,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-api/0.20.1 2025-10-15T00:45:52,902 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl/0.20.1 2025-10-15T00:45:52,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-public-key-algs/14.0.18 2025-10-15T00:45:52,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-compression/4.2.6.Final 2025-10-15T00:45:52,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.h2database/2.3.232 2025-10-15T00:45:52,914 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.shaded-sshd/9.0.1 2025-10-15T00:45:52,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.shiro-api/0.21.2 2025-10-15T00:45:52,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-encryption-algs/14.0.18 2025-10-15T00:45:52,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-key-exchange-algs/14.0.18 2025-10-15T00:45:52,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.ssh-mac-algs/14.0.18 2025-10-15T00:45:52,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-common/14.0.18 2025-10-15T00:45:52,918 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.ietf-topology/2013.10.21.26_18 2025-10-15T00:45:52,918 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-http/4.2.6.Final 2025-10-15T00:45:52,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-http2/4.2.6.Final 2025-10-15T00:45:52,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.commons-codec/1.19.0 2025-10-15T00:45:52,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-api/9.0.1 2025-10-15T00:45:52,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-common/14.0.18 2025-10-15T00:45:52,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-client/14.0.18 2025-10-15T00:45:52,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9643-ietf-tcp-server/14.0.18 2025-10-15T00:45:52,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-tcp/9.0.1 2025-10-15T00:45:52,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9645-ietf-tls-client/14.0.18 2025-10-15T00:45:52,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-tls/9.0.1 2025-10-15T00:45:52,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.iana.crypt-hash/14.0.18 2025-10-15T00:45:52,922 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-http/9.0.1 2025-10-15T00:45:52,922 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jolokia.osgi/1.7.2 2025-10-15T00:45:52,925 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [155]] 2025-10-15T00:45:52,937 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@5ab5e77a,contexts=[{HS,OCM-5,context:750064721,/}]} 2025-10-15T00:45:52,937 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@5ab5e77a,contexts=null}", size=3} 2025-10-15T00:45:52,938 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{HS,id=OCM-5,name='context:750064721',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:750064721',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@2cb51451}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@673cef67{/,null,STOPPED} 2025-10-15T00:45:52,938 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@673cef67{/,null,STOPPED} 2025-10-15T00:45:52,939 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@5ab5e77a,contexts=[{HS,OCM-5,context:750064721,/}]} 2025-10-15T00:45:52,942 | INFO | 
paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:750064721',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:750064721',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@2cb51451}} 2025-10-15T00:45:52,958 | INFO | paxweb-config-3-thread-1 | osgi | 155 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2025-10-15T00:45:52,981 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@673cef67{/,null,AVAILABLE} 2025-10-15T00:45:52,982 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:750064721',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:750064721',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@2cb51451}}} as OSGi service for "/" context path 2025-10-15T00:45:52,985 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-transform/14.0.17 2025-10-15T00:45:52,986 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.password-service-api/0.21.2 2025-10-15T00:45:52,986 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-access-api/11.0.2 2025-10-15T00:45:52,986 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-cluster-admin-api/11.0.2 2025-10-15T00:45:52,987 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.libraries.liblldp/0.20.1 2025-10-15T00:45:52,987 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 2025-10-15T00:45:52,992 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.NotificationService)] 2025-10-15T00:45:52,996 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.validation.jakarta.validation-api/2.0.2 2025-10-15T00:45:52,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-client/2.47.0 2025-10-15T00:45:52,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.glassfish.jersey.core.jersey-server/2.47.0 2025-10-15T00:45:52,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-common-util/11.0.2 2025-10-15T00:45:52,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.text/1.14.0 2025-10-15T00:45:53,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-access-client/11.0.2 2025-10-15T00:45:53,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-mgmt-api/11.0.2 2025-10-15T00:45:53,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-journal/11.0.2 2025-10-15T00:45:53,005 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.authn-api/0.21.2 2025-10-15T00:45:53,006 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.tokenauthrealm/0.21.2 2025-10-15T00:45:53,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8072/14.0.18 2025-10-15T00:45:53,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.media.jersey-media-sse/2.47.0 2025-10-15T00:45:53,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javassist/3.30.2.GA 2025-10-15T00:45:53,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-client/14.0.18 2025-10-15T00:45:53,011 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8040-ietf-restconf-monitoring/14.0.18 2025-10-15T00:45:53,011 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc9644-ietf-ssh-server/14.0.18 2025-10-15T00:45:53,011 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc7407-ietf-x509-cert-to-name/14.0.18 2025-10-15T00:45:53,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.draft-ietf-restconf-server/9.0.1 2025-10-15T00:45:53,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.encrypt-service/0.21.2 2025-10-15T00:45:53,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.cert/0.21.2 2025-10-15T00:45:53,016 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2025-10-15T00:45:53,016 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.external.aopalliance-repackaged/2.6.1 2025-10-15T00:45:53,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.osgi-resource-locator/1.0.3 2025-10-15T00:45:53,105 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6470/14.0.18 2025-10-15T00:45:53,106 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc6243/14.0.18 2025-10-15T00:45:53,106 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc7952/14.0.18 2025-10-15T00:45:53,106 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8342-ietf-origin/14.0.18 2025-10-15T00:45:53,106 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8526/14.0.18 2025-10-15T00:45:53,107 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8650/14.0.18 2025-10-15T00:45:53,107 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding.model.ietf.rfc8525/14.0.18 2025-10-15T00:45:53,107 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-schema-osgi/14.0.18 2025-10-15T00:45:53,112 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | DOM Schema services activated 2025-10-15T00:45:53,112 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | Updating context to generation 1 2025-10-15T00:45:53,113 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.api/0.21.2 2025-10-15T00:45:53,113 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.servlet-api/0.21.2 2025-10-15T00:45:53,113 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-api/9.0.1 2025-10-15T00:45:53,114 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-api/9.0.1 2025-10-15T00:45:53,114 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.netconf-common-mdsal/9.0.1 2025-10-15T00:45:53,114 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-export/14.0.17 2025-10-15T00:45:53,115 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-spi/9.0.1 2025-10-15T00:45:53,115 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.osgi-impl/0.21.2 2025-10-15T00:45:53,117 | INFO | features-3-thread-1 | FeaturesServiceImpl 
| 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.servlet-jersey2/0.21.2 2025-10-15T00:45:53,125 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-mdsal-spi/9.0.1 2025-10-15T00:45:53,126 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-broker/14.0.18 2025-10-15T00:45:53,138 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for MountPointService activated 2025-10-15T00:45:53,147 | INFO | features-3-thread-1 | DOMRpcRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM RPC/Action router started 2025-10-15T00:45:53,152 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionProviderService activated 2025-10-15T00:45:53,155 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionService activated 2025-10-15T00:45:53,161 | INFO | features-3-thread-1 | DOMNotificationRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM Notification Router started 2025-10-15T00:45:53,164 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T00:45:53,164 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationService activated 2025-10-15T00:45:53,168 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T00:45:53,168 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationPublishService activated 2025-10-15T00:45:53,172 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2025-10-15T00:45:53,172 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM 
adapter for RpcProviderService activated 2025-10-15T00:45:53,175 | INFO | features-3-thread-1 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcService activated 2025-10-15T00:45:53,175 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.odl-device-notification/9.0.1 2025-10-15T00:45:53,177 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.sal-remote-impl/9.0.1 2025-10-15T00:45:53,180 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-subscription/9.0.1 2025-10-15T00:45:53,187 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-mdsal/9.0.1 2025-10-15T00:45:53,193 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-jaxrs/9.0.1 2025-10-15T00:45:53,198 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.yanglib-mdsal-writer/9.0.1 2025-10-15T00:45:53,233 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.extension-onf/0.20.1 2025-10-15T00:45:53,237 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-api/0.20.1 2025-10-15T00:45:53,238 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.encrypt-service-impl/0.21.2 2025-10-15T00:45:53,242 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.device-ownership-service/0.20.1 2025-10-15T00:45:53,244 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 2025-10-15T00:45:53,252 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T00:45:53,253 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.rabbitmq.client/5.26.0 2025-10-15T00:45:53,277 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.containers.jersey-container-servlet/2.47.0 2025-10-15T00:45:53,278 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.googlecode.json-simple/1.1.1 2025-10-15T00:45:53,279 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.atomix-storage/11.0.2 2025-10-15T00:45:53,282 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.jvm/4.2.36 2025-10-15T00:45:53,284 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.bulk-o-matic/0.20.1 2025-10-15T00:45:53,286 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.util/1.1.3 2025-10-15T00:45:53,287 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.healthchecks/4.2.36 2025-10-15T00:45:53,288 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-impl/0.20.1 2025-10-15T00:45:53,297 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 2025-10-15T00:45:53,305 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-15T00:45:53,315 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-15T00:45:53,315 | INFO | features-3-thread-1 | OpenflowServiceRecoveryHandlerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Registering openflowplugin service recovery handlers 2025-10-15T00:45:53,316 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jdbc.core/4.4.8 2025-10-15T00:45:53,327 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.8 2025-10-15T00:45:53,328 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-shell/7.1.7 2025-10-15T00:45:53,330 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.7 2025-10-15T00:45:53,330 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server/9.0.1 2025-10-15T00:45:53,333 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-nb/9.0.1 2025-10-15T00:45:53,337 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.blueprint-config/0.20.1 2025-10-15T00:45:53,339 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.netconf.model.rfc5277/9.0.1 2025-10-15T00:45:53,340 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | karaf.branding/14.1.3 2025-10-15T00:45:53,340 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-ssh/9.0.1 2025-10-15T00:45:53,340 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-fs/14.0.17 2025-10-15T00:45:53,340 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-shell/0.20.1 2025-10-15T00:45:53,342 | INFO | features-3-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1. Missing service: [org.opendaylight.mdsal.binding.api.DataBroker, org.opendaylight.serviceutils.srm.spi.RegistryControl] 2025-10-15T00:45:53,342 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.password-service-impl/0.21.2 2025-10-15T00:45:53,346 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.idm-store-h2/0.21.2 2025-10-15T00:45:53,347 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.jetty-auth-log-filter/0.21.2 2025-10-15T00:45:53,348 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.of-switch-config-pusher/0.20.1 2025-10-15T00:45:53,350 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-ri/14.0.17 2025-10-15T00:45:53,351 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.truststore-none/9.0.1 2025-10-15T00:45:53,352 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.topology/0.20.1 2025-10-15T00:45:53,353 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.keystore-none/9.0.1 2025-10-15T00:45:53,353 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.inject.jersey-hk2/2.47.0 2025-10-15T00:45:53,353 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.topology-manager/0.20.1 2025-10-15T00:45:53,357 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jspecify.jspecify/1.0.0 2025-10-15T00:45:53,357 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.graphite/4.2.36 2025-10-15T00:45:53,358 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding-util/14.0.18 2025-10-15T00:45:53,358 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-dom-api/11.0.2 2025-10-15T00:45:53,359 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.8 | org.opendaylight.controller.sal-akka-segmented-journal/11.0.2 2025-10-15T00:45:53,359 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.locator/2.6.1 2025-10-15T00:45:53,360 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.utils/2.6.1 2025-10-15T00:45:53,361 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.api/2.6.1 2025-10-15T00:45:53,362 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.filterchain/0.21.2 2025-10-15T00:45:53,366 | INFO | features-3-thread-1 | CustomFilterAdapterConfigurationImpl | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg, component.id=110, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)} 2025-10-15T00:45:53,367 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.shiro/0.21.2 2025-10-15T00:45:53,371 | INFO | features-3-thread-1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.IIDMStore), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T00:45:53,382 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.21.2 [172]] 2025-10-15T00:45:53,383 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-10-15T00:45:53,383 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1} 2025-10-15T00:45:53,383 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-10-15T00:45:53,384 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-akka-raft/11.0.2 2025-10-15T00:45:53,388 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-distributed-datastore/11.0.2 2025-10-15T00:45:53,401 | INFO | features-3-thread-1 | 
FileModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Shard configuration provider started 2025-10-15T00:45:53,401 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-remoterpc-connector/11.0.2 2025-10-15T00:45:53,403 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.activation-api/1.2.2 2025-10-15T00:45:53,406 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-common/2.47.0 2025-10-15T00:45:53,409 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.ws.rs-api/2.1.6 2025-10-15T00:45:53,410 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.containers.jersey-container-servlet-core/2.47.0 2025-10-15T00:45:53,411 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-clustering-commons/11.0.2 2025-10-15T00:45:53,419 | INFO | features-3-thread-1 | FileAkkaConfigurationReader | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | File-based Pekko configuration reader enabled 2025-10-15T00:45:53,419 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider starting 2025-10-15T00:45:53,589 | INFO | features-3-thread-1 | ActorSystemProviderImpl | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating new ActorSystem 2025-10-15T00:45:53,968 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Slf4jLogger started 2025-10-15T00:45:54,205 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.170.142:2550] with UID [9130694333926362031] 2025-10-15T00:45:54,222 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Starting up, Pekko version [1.0.3] ... 2025-10-15T00:45:54,271 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Registered cluster JMX MBean [pekko:type=Cluster] 2025-10-15T00:45:54,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Started up successfully 2025-10-15T00:45:54,305 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.170.142:2550#9130694333926362031], selfDc [default]. 
2025-10-15T00:45:54,513 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider started 2025-10-15T00:45:54,515 | INFO | features-3-thread-1 | OSGiDatastoreContextIntrospectorFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore Context Introspector activated 2025-10-15T00:45:54,517 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION starting 2025-10-15T00:45:54,692 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-124536349]], but this node is not initialized yet 2025-10-15T00:45:54,846 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : config 2025-10-15T00:45:54,847 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T00:45:54,848 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T00:45:54,854 | INFO | features-3-thread-1 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-config 2025-10-15T00:45:54,938 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-config 2025-10-15T00:45:54,951 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-32 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage MAPPED 2025-10-15T00:45:55,013 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Recovery complete 2025-10-15T00:45:55,047 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RecoveringClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: saving tombstone ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0} 2025-10-15T00:45:55,146 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store config is using tell-based protocol 2025-10-15T00:45:55,154 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T00:45:55,155 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T00:45:55,155 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 196 - 
org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL starting 2025-10-15T00:45:55,162 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : operational 2025-10-15T00:45:55,162 | INFO | features-3-thread-1 | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-operational 2025-10-15T00:45:55,164 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-operational 2025-10-15T00:45:55,173 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-topology-config: Shard created, persistent : true 2025-10-15T00:45:55,173 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-config: Shard created, persistent : true 2025-10-15T00:45:55,173 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: Shard created, persistent : true 2025-10-15T00:45:55,174 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-toaster-config: Shard created, persistent : true 2025-10-15T00:45:55,174 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Recovery complete 2025-10-15T00:45:55,219 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Starting recovery with journal batch size 1 2025-10-15T00:45:55,219 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Starting recovery with journal batch size 1 2025-10-15T00:45:55,219 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Starting recovery with journal batch size 1 2025-10-15T00:45:55,239 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-topology-config/member-2-shard-topology-config-notifier#1729200481 created and ready for shard:member-2-shard-topology-config 2025-10-15T00:45:55,239 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-toaster-config/member-2-shard-toaster-config-notifier#463989556 created and ready for shard:member-2-shard-toaster-config 2025-10-15T00:45:55,240 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | 
RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-default-config/member-2-shard-default-config-notifier#2024923431 created and ready for shard:member-2-shard-default-config 2025-10-15T00:45:55,240 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-inventory-config/member-2-shard-inventory-config-notifier#-1402299369 created and ready for shard:member-2-shard-inventory-config 2025-10-15T00:45:55,241 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Starting recovery with journal batch size 1 2025-10-15T00:45:55,245 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1191913206]], but this node is not initialized yet 2025-10-15T00:45:55,278 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1191913206]], but this node is not initialized yet 2025-10-15T00:45:55,279 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1191913206]], but this node is not initialized yet 2025-10-15T00:45:55,279 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1191913206]], but this node is not initialized yet 2025-10-15T00:45:55,337 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | RecoveringClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: saving tombstone ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0} 2025-10-15T00:45:55,341 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store operational is using tell-based protocol 2025-10-15T00:45:55,391 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-operational: Shard created, persistent : false 2025-10-15T00:45:55,392 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service 
starting 2025-10-15T00:45:55,393 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-default-operational/member-2-shard-default-operational-notifier#-1503705461 created and ready for shard:member-2-shard-default-operational 2025-10-15T00:45:55,394 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Starting recovery with journal batch size 1 2025-10-15T00:45:55,400 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service started 2025-10-15T00:45:55,401 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.eos-dom-akka/11.0.2 2025-10-15T00:45:55,403 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-topology-operational: Shard created, persistent : false 2025-10-15T00:45:55,403 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Starting recovery with journal batch size 1 2025-10-15T00:45:55,403 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-topology-operational/member-2-shard-topology-operational-notifier#-1957218009 created and ready for shard:member-2-shard-topology-operational 2025-10-15T00:45:55,404 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-45 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage DISK 2025-10-15T00:45:55,414 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-operational: Shard created, persistent : false 2025-10-15T00:45:55,415 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-inventory-operational/member-2-shard-inventory-operational-notifier#-1916212181 created and ready for shard:member-2-shard-inventory-operational 2025-10-15T00:45:55,415 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Starting recovery with journal batch size 1 2025-10-15T00:45:55,416 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-toaster-operational: Shard created, persistent : false 2025-10-15T00:45:55,417 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | 
RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-toaster-operational/member-2-shard-toaster-operational-notifier#88792400 created and ready for shard:member-2-shard-toaster-operational 2025-10-15T00:45:55,417 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Starting recovery with journal batch size 1 2025-10-15T00:45:55,522 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T00:45:55,523 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T00:45:55,592 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.common/4.2.6.Final 2025-10-15T00:45:55,614 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.buffer/4.2.6.Final 2025-10-15T00:45:55,614 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.util/0.20.1 2025-10-15T00:45:55,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.keystore-api/9.0.1 2025-10-15T00:45:55,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.blueprint-config/0.20.1 2025-10-15T00:45:55,621 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-cluster-admin-impl/11.0.2 2025-10-15T00:45:55,625 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.truststore-api/9.0.1 2025-10-15T00:45:55,643 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: journal open: applyTo=0 2025-10-15T00:45:55,644 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: journal open: applyTo=0 2025-10-15T00:45:55,644 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: journal open: applyTo=0 2025-10-15T00:45:55,644 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: journal open: applyTo=0 2025-10-15T00:45:55,643 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | EnabledRaftStorage | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: journal open: applyTo=0 2025-10-15T00:45:55,645 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: journal open: applyTo=0 2025-10-15T00:45:55,646 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: journal open: applyTo=0 2025-10-15T00:45:55,646 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: journal open: applyTo=0 2025-10-15T00:45:55,648 | INFO | features-3-thread-1 | Activator | 99 - org.apache.karaf.deployer.features - 4.4.8 | Deployment finished. Registering FeatureDeploymentListener 2025-10-15T00:45:55,675 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,675 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,676 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,676 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,677 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,677 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,677 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:55,680 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Recovery completed - Switching actor to 
Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T00:45:56,061 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,185 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , received role change from null to Follower 2025-10-15T00:45:56,191 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T00:45:56,191 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from null to Follower 2025-10-15T00:45:56,422 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,422 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,423 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,423 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , received role change from null to Follower 2025-10-15T00:45:56,423 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,424 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , received role change from null to Follower 2025-10-15T00:45:56,424 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T00:45:56,424 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,425 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for 
member-2-shard-toaster-operational , received role change from null to Follower 2025-10-15T00:45:56,425 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , received role change from null to Follower 2025-10-15T00:45:56,425 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,425 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T00:45:56,425 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-inventory-operational from null to Follower 2025-10-15T00:45:56,426 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , received role change from null to Follower 2025-10-15T00:45:56,426 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-toaster-config from null to Follower 2025-10-15T00:45:56,426 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T00:45:56,426 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T00:45:56,426 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from null to Follower 2025-10-15T00:45:56,427 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Local TermInfo store seeded with TermInfo{term=0} 2025-10-15T00:45:56,427 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T00:45:56,427 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-default-operational from null to Follower 2025-10-15T00:45:56,427 | 
INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-toaster-operational from null to Follower 2025-10-15T00:45:56,427 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from null to Follower 2025-10-15T00:45:56,428 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , received role change from null to Follower 2025-10-15T00:45:56,428 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T00:45:56,428 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T00:45:56,429 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-topology-config from null to Follower 2025-10-15T00:45:56,429 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-topology-operational from null to Follower 2025-10-15T00:45:57,299 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Done. 2025-10-15T00:46:00,602 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.2 | member-2 : Initial removal of candidates from previous iteration failed. Rescheduling. java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#-626115851]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply. at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?] at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?] at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?] 
at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?] at java.lang.Thread.run(Thread.java:1583) ~[?:?] 2025-10-15T00:46:05,620 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.2 | member-2 : Initial removal of candidates from previous iteration failed. Rescheduling. java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#-626115851]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply. at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?] at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?] at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?] at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?] at java.lang.Thread.run(Thread.java:1583) ~[?:?] 2025-10-15T00:46:06,341 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate): Starting new election term 1 2025-10-15T00:46:06,342 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,342 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , received role change from Follower to Candidate 2025-10-15T00:46:06,342 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from Follower to Candidate 2025-10-15T00:46:06,486 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Candidate): Starting new election term 1 2025-10-15T00:46:06,487 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational 
, received role change from Follower to Candidate 2025-10-15T00:46:06,488 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-default-operational from Follower to Candidate 2025-10-15T00:46:06,519 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Candidate): Starting new election term 1 2025-10-15T00:46:06,519 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,520 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , received role change from Follower to Candidate 2025-10-15T00:46:06,520 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-inventory-operational from Follower to Candidate 2025-10-15T00:46:06,571 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Candidate): Starting new election term 1 2025-10-15T00:46:06,571 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Candidate): Starting new election term 1 2025-10-15T00:46:06,571 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,571 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,571 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , received role change from Follower to Candidate 2025-10-15T00:46:06,572 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , received role change from Follower to Candidate 2025-10-15T00:46:06,572 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-topology-config from Follower to Candidate 2025-10-15T00:46:06,572 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-toaster-config from Follower to Candidate 2025-10-15T00:46:06,616 | INFO | 
opendaylight-cluster-data-shard-dispatcher-35 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Candidate): Starting new election term 1 2025-10-15T00:46:06,616 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Candidate): Starting new election term 1 2025-10-15T00:46:06,616 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Candidate): Starting new election term 1 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , received role change from Follower to Candidate 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , received role change from Follower to Candidate 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from Follower to Candidate 2025-10-15T00:46:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-toaster-operational from Follower to Candidate 2025-10-15T00:46:06,618 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-topology-operational from Follower to Candidate 2025-10-15T00:46:06,618 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from Follower to Candidate 2025-10-15T00:46:07,210 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from 
[Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-124536349]], but this node is not initialized yet 2025-10-15T00:46:07,295 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoinAck message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon#-1019253198]] to [pekko://opendaylight-cluster-data@10.30.170.142:2550] 2025-10-15T00:46:07,341 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Welcome from [pekko://opendaylight-cluster-data@10.30.170.193:2550] 2025-10-15T00:46:07,346 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T00:46:07,346 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T00:46:07,347 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T00:46:07,347 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T00:46:07,347 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-1-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer 
member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-1-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T00:46:07,349 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-1-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-1-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T00:46:07,349 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T00:46:07,350 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T00:46:07,350 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-1-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T00:46:07,350 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-1-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T00:46:07,350 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address for peer member-1-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T00:46:07,348 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer 
address for peer member-1-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T00:46:07,352 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#-696520699] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T00:46:07,352 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#-696520699] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T00:46:07,353 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-2100128858] was unhandled. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T00:46:07,353 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-2100128858] was unhandled. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T00:46:07,396 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton identified at [pekko://opendaylight-cluster-data@10.30.170.193:2550/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-15T00:46:07,446 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$a] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$a] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T00:46:07,446 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$b] was not delivered. [6] dead letters encountered. 
If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$b] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T00:46:08,565 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T00:46:08,565 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T00:46:08,565 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T00:46:08,565 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T00:46:08,565 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T00:46:08,566 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T00:46:08,566 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T00:46:08,566 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T00:46:08,567 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.142:2550 2025-10-15T00:46:08,567 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-config with address 
pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-15T00:46:08,567 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-15T00:46:08,567 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-15T00:46:08,567 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-15T00:46:08,566 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T00:46:08,566 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T00:46:08,567 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T00:46:08,568 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T00:46:08,568 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T00:46:08,568 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T00:46:08,566 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address 
for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.142:2550 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-15T00:46:08,569 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-15T00:46:08,574 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | ClusterSingletonManager state change [Start -> Younger] 2025-10-15T00:46:15,344 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Candidate): Term 2 in "RequestVote{term=2, 
candidateId=member-1-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,344 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,344 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-default-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,345 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,345 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,345 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-inventory-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,345 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:15,346 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-1-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower
2025-10-15T00:46:16,141 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2
2025-10-15T00:46:16,141 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2
2025-10-15T00:46:16,141 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for
member-2-shard-default-config , received role change from Candidate to Follower 2025-10-15T00:46:16,141 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , received role change from Candidate to Follower 2025-10-15T00:46:16,142 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from Candidate to Follower 2025-10-15T00:46:16,142 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-topology-operational from Candidate to Follower 2025-10-15T00:46:16,144 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-10-15T00:46:16,144 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-10-15T00:46:16,144 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , received role change from Candidate to Follower 2025-10-15T00:46:16,144 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , received role change from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-toaster-config from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-inventory-operational from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-10-15T00:46:16,144 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - 
org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , received role change from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , received role change from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from Candidate to Follower 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-default-operational from Candidate to Follower 2025-10-15T00:46:16,144 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2025-10-15T00:46:16,145 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-topology-config from Candidate to Follower 2025-10-15T00:46:16,146 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , received role change from Candidate to Follower 2025-10-15T00:46:16,146 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-toaster-operational from Candidate to Follower 2025-10-15T00:46:16,176 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2c858571 2025-10-15T00:46:16,177 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-default-config status sync done false 2025-10-15T00:46:16,177 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4bd856bf 2025-10-15T00:46:16,178 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received 
LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1d965eb9 2025-10-15T00:46:16,178 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-default-operational status sync done false 2025-10-15T00:46:16,178 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-inventory-operational status sync done false 2025-10-15T00:46:16,178 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-topology-operational status sync done false 2025-10-15T00:46:16,178 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@64a6745a 2025-10-15T00:46:16,179 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@22298b6a 2025-10-15T00:46:16,180 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@11ceaf2a 2025-10-15T00:46:16,180 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-toaster-config status sync done false 2025-10-15T00:46:16,180 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@45f72121 2025-10-15T00:46:16,180 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-topology-config status sync done false 2025-10-15T00:46:16,180 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-15T00:46:16,181 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-toaster-operational status sync done false 2025-10-15T00:46:16,183 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | 
Datastore service type OPERATIONAL activated 2025-10-15T00:46:16,183 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL started 2025-10-15T00:46:16,224 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6900b886 2025-10-15T00:46:16,224 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-15T00:46:16,225 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-inventory-config status sync done false 2025-10-15T00:46:16,244 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ConcurrentDOMDataBroker | 358 - org.opendaylight.yangtools.util - 14.0.17 | ThreadFactory created: CommitFutures 2025-10-15T00:46:16,246 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | DataBrokerCommitExecutor | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker commit exector started 2025-10-15T00:46:16,246 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type CONFIGURATION activated 2025-10-15T00:46:16,248 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker started 2025-10-15T00:46:16,252 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for DataBroker activated 2025-10-15T00:46:16,349 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1448842863], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-10-15T00:46:16,349 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1448842863], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T00:46:16,361 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | 
ArbitratorReconciliationManagerImpl | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.1 | ArbitratorReconciliationManager has started successfully. 2025-10-15T00:46:16,363 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T00:46:16,362 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1448842863], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 13.44 ms 2025-10-15T00:46:16,382 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [Initial app config AaaCertServiceConfig, (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2025-10-15T00:46:16,386 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | EOSClusterSingletonServiceProvider | 257 - org.opendaylight.mdsal.mdsal-singleton-impl - 14.0.18 | Cluster Singleton Service started 2025-10-15T00:46:16,391 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#-1199572057], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-10-15T00:46:16,391 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#-1199572057], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T00:46:16,392 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | ietf-yang-library writer registered 2025-10-15T00:46:16,398 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ClientActorBehavior | 182 - 
org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#-1199572057], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 6.220 ms
2025-10-15T00:46:16,416 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [Initial app config LldpSpeakerConfig]
2025-10-15T00:46:16,450 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file
2025-10-15T00:46:16,459 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | rpc-requests-quota configuration property was changed to '20000'
2025-10-15T00:46:16,459 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | global-notification-quota configuration property was changed to '64000'
2025-10-15T00:46:16,459 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | switch-features-mandatory configuration property was changed to 'false'
2025-10-15T00:46:16,459 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | enable-flow-removed-notification configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-rpc-enabled configuration property was changed to 'false'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-count-limit configuration property was changed to '25600'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-interval-timeout-limit configuration property was changed to '500'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | echo-reply-timeout configuration property was changed to '2000'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-table-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-flow-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-group-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-meter-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-port-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-queue-statistics-polling-on configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | skip-table-features configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | basic-timer-delay configuration property was changed to '3000'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | maximum-timer-delay configuration property was changed to '900000'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | use-single-layer-serialization configuration property was changed to 'true'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-min-threads configuration property was changed to '1'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-max-threads configuration property was changed to '32000'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-timeout configuration property was changed to '60'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-rate-limit-per-min configuration property was changed to '0'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-hold-time-in-seconds configuration property was changed to '0'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-datastore-removal-delay configuration property was changed to '500'
2025-10-15T00:46:16,460 | INFO | Blueprint Extender: 2 | OSGiConfigurationServiceFactory | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file
2025-10-15T00:46:16,466 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg'
2025-10-15T00:46:16,466 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin'
2025-10-15T00:46:16,467 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [Initial app config TopologyLldpDiscoveryConfig]
2025-10-15T00:46:16,468 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 has been started
2025-10-15T00:46:16,468 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2025-10-15T00:46:16,469 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-default-operational status sync done true
2025-10-15T00:46:16,470 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.20.1 [309] was successfully created
2025-10-15T00:46:16,472 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2025-10-15T00:46:16,507 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [Initial app config ForwardingRulesManagerConfig]
2025-10-15T00:46:16,541 | INFO | Blueprint Extender: 1 | LLDPSpeaker | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | LLDPSpeaker started, it will send LLDP frames each 5 seconds
2025-10-15T00:46:16,550 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@2716557f was registered as configuration listener to OpenFlowPlugin configuration service
2025-10-15T00:46:16,574 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | MD-SAL configuration-based SwitchConnectionProviders started
2025-10-15T00:46:16,575 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1
2025-10-15T00:46:16,595 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 |
org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@292cf648 was registered as configuration listener to OpenFlowPlugin configuration service 2025-10-15T00:46:16,599 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiPasswordServiceConfigBootstrap | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | Listening for password service configuration 2025-10-15T00:46:16,605 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | DeviceOwnershipService started 2025-10-15T00:46:16,607 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3 2025-10-15T00:46:16,628 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | DefaultConfigPusher | 301 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.20.1 | DefaultConfigPusher has started. 2025-10-15T00:46:16,627 | INFO | Blueprint Extender: 2 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | LLDPDiscoveryListener started. 2025-10-15T00:46:16,632 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 has been started 2025-10-15T00:46:16,636 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.20.1 [303] was successfully created 2025-10-15T00:46:16,639 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-default-config status sync done true 2025-10-15T00:46:16,647 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T00:46:16,648 | ERROR | opendaylight-cluster-data-notification-dispatcher-51 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | bundle org.opendaylight.aaa.idm-store-h2:0.21.2 (167)[org.opendaylight.aaa.datastore.h2.H2Store(103)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider 2025-10-15T00:46:16,654 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.1 | Topology Manager service started. 
2025-10-15T00:46:16,658 | INFO | CommitFutures-0 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | Configuration update succeeded 2025-10-15T00:46:16,665 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default iteration count=20000 2025-10-15T00:46:16,666 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default algorithm=SHA-512 2025-10-15T00:46:16,666 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will not utilize a private salt, since none was configured 2025-10-15T00:46:16,670 | INFO | Blueprint Extender: 1 | NodeConnectorInventoryEventTranslator | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | NodeConnectorInventoryEventTranslator has started. 2025-10-15T00:46:16,673 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 has been started 2025-10-15T00:46:16,674 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.20.1 [300] was successfully created 2025-10-15T00:46:16,682 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | H2 IDMStore activated 2025-10-15T00:46:16,684 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T00:46:16,685 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration] 2025-10-15T00:46:16,696 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [Initial app config DatastoreConfig, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration] 2025-10-15T00:46:16,737 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [Initial app config DatastoreConfig, (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)] 2025-10-15T00:46:16,739 | INFO | Blueprint Extender: 2 | 
BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)] 2025-10-15T00:46:16,782 | INFO | Blueprint Extender: 3 | ForwardingRulesManagerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | ForwardingRulesManager has started successfully. 2025-10-15T00:46:16,785 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 has been started 2025-10-15T00:46:16,789 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.20.1 [299] was successfully created 2025-10-15T00:46:16,812 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] 2025-10-15T00:46:16,815 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] 2025-10-15T00:46:16,842 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiClusterAdmin | 193 - org.opendaylight.controller.sal-cluster-admin-impl - 11.0.2 | Cluster Admin services started 2025-10-15T00:46:16,842 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION started 2025-10-15T00:46:16,870 | INFO | CommitFutures-1 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] populated 2025-10-15T00:46:16,911 | INFO | CommitFutures-1 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] populated 2025-10-15T00:46:16,918 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OSGiSwitchConnectionProviders | 316 - 
org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-default-impl' 2025-10-15T00:46:16,967 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | AAAEncryptionServiceImpl | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | AAAEncryptionService activated 2025-10-15T00:46:16,968 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | Encryption Service enabled 2025-10-15T00:46:17,029 | INFO | Blueprint Extender: 1 | LazyBindingList | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Using lazy population for lists larger than 16 element(s) 2025-10-15T00:46:17,065 | INFO | Blueprint Extender: 1 | AaaCertMdsalProvider | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCertMdsalProvider Initialized 2025-10-15T00:46:17,104 | INFO | Blueprint Extender: 1 | MdsalUtils | 163 - org.opendaylight.aaa.cert - 0.21.2 | initDatastore: data populated: CONFIGURATION, DataObjectIdentifier[ @ urn.opendaylight.yang.aaa.cert.mdsal.rev160321.KeyStores ], KeyStores{id=KeyStores:1} 2025-10-15T00:46:17,210 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-topology-config status sync done true 2025-10-15T00:46:17,211 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-inventory-config status sync done true 2025-10-15T00:46:17,211 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-toaster-operational status sync done true 2025-10-15T00:46:17,211 | INFO | opendaylight-cluster-data-shard-dispatcher-47 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-toaster-config status sync done true 2025-10-15T00:46:17,211 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-topology-operational status sync done true 2025-10-15T00:46:17,211 | INFO | opendaylight-cluster-data-shard-dispatcher-46 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-inventory-operational status sync done true 2025-10-15T00:46:17,323 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#199778634], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2025-10-15T00:46:17,324 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - 
org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#199778634], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T00:46:17,325 | INFO | Blueprint Extender: 1 | ODLKeyTool | 163 - org.opendaylight.aaa.cert - 0.21.2 | ctl.jks is created 2025-10-15T00:46:17,327 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#199778634], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 3.121 ms 2025-10-15T00:46:17,370 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | OpenFlowPluginProvider started, waiting for onSystemBootReady() 2025-10-15T00:46:17,370 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@5486185b 2025-10-15T00:46:17,379 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OnfExtensionProvider | 308 - org.opendaylight.openflowplugin.extension-onf - 0.20.1 | ONF Extension Provider started. 
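
DefaultPasswordHashService above reports its defaults: algorithm SHA-512, iteration count 20000, and no private salt configured. The sketch below is only a conceptual Python illustration of iterated, salted SHA-512 with those defaults; the actual hashing in this build is performed by the repackaged Apache Shiro hash service, whose salt handling and output encoding may differ from this approximation.

#!/usr/bin/env python3
"""Conceptual sketch of the iterated, salted SHA-512 scheme described by the
DefaultPasswordHashService entries above (algorithm=SHA-512, iterations=20000,
no private salt). Illustration only; the real implementation is Shiro-based."""
import hashlib
import os

def hash_password(password: str, salt: bytes | None = None, iterations: int = 20_000) -> tuple[bytes, bytes]:
    """Return (salt, digest) after applying SHA-512 'iterations' times in total."""
    salt = salt if salt is not None else os.urandom(16)  # per-user random salt; no private (pepper) salt
    digest = hashlib.sha512(salt + password.encode("utf-8")).digest()
    for _ in range(iterations - 1):
        digest = hashlib.sha512(digest).digest()
    return salt, digest

if __name__ == "__main__":
    salt, digest = hash_password("example-password")  # hypothetical credential
    print(len(salt), digest.hex()[:32], "...")
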
2025-10-15T00:46:17,385 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl' 2025-10-15T00:46:17,387 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@57e9dc55 2025-10-15T00:46:17,428 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T00:46:17,429 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T00:46:17,436 | INFO | Blueprint Extender: 1 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | Certificate Manager service has been initialized 2025-10-15T00:46:17,440 | INFO | Blueprint Extender: 1 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCert Rpc Service has been initialized 2025-10-15T00:46:17,442 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 has been started 2025-10-15T00:46:17,443 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.cert_0.21.2 [163] was successfully created 2025-10-15T00:46:17,471 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Checking if default entries must be created in IDM store 2025-10-15T00:46:17,643 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_DOMAINS does not exist, creating it 2025-10-15T00:46:17,856 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Created default domain 2025-10-15T00:46:17,862 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_ROLES does not exist, creating it 2025-10-15T00:46:17,904 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Created 'admin' role 2025-10-15T00:46:17,917 | INFO | Blueprint Extender: 3 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Created 'user' role 2025-10-15T00:46:18,060 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_USERS does not exist, creating it 2025-10-15T00:46:18,101 | INFO | Blueprint Extender: 3 | AbstractStore | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | Table AAA_GRANTS does not exist, creating it 2025-10-15T00:46:18,195 | INFO | Blueprint Extender: 3 | AAAShiroProvider | 172 - org.opendaylight.aaa.shiro - 0.21.2 | AAAShiroProvider Session Initiated 2025-10-15T00:46:18,323 | INFO | Blueprint Extender: 3 | IniSecurityManagerFactory | 171 - org.opendaylight.aaa.repackaged-shiro - 0.21.2 | Realms have been explicitly set on the SecurityManager 
instance - auto-setting of realms will not occur. 2025-10-15T00:46:18,352 | ERROR | Blueprint Extender: 3 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 9.0.1 | bundle org.opendaylight.netconf.restconf-server-mdsal:9.0.1 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(78)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation; 2025-10-15T00:46:18,437 | INFO | Blueprint Extender: 3 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278]] 2025-10-15T00:46:18,438 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/rests'} 2025-10-15T00:46:18,438 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=320, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2} 2025-10-15T00:46:18,438 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/rests'} 2025-10-15T00:46:18,439 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=320, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@35f5667b{/rests,null,STOPPED} 2025-10-15T00:46:18,440 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@35f5667b{/rests,null,STOPPED} 2025-10-15T00:46:18,443 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]} 2025-10-15T00:46:18,444 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}", size=2} 2025-10-15T00:46:18,445 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T00:46:18,445 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /rests with 4 service(s) 2025-10-15T00:46:18,446 | INFO | paxweb-config-3-thread-1 | 
JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=320, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} 2025-10-15T00:46:18,447 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /.well-known with 3 service(s) 2025-10-15T00:46:18,448 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter 2025-10-15T00:46:18,448 | INFO | Blueprint Extender: 3 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@700165d9 2025-10-15T00:46:18,449 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters: 2025-10-15T00:46:18,449 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@35f5667b{/rests,null,AVAILABLE} 2025-10-15T00:46:18,449 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=320, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path 2025-10-15T00:46:18,450 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T00:46:18,450 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]} 2025-10-15T00:46:18,450 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}", size=2} 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]} 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty 
- 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]}", size=1} 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RESTCONF,/rests}]} 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-20,contextPath='/.well-known'} 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2} 2025-10-15T00:46:18,451 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-20,contextPath='/.well-known'} 2025-10-15T00:46:18,452 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@4b6685ae{/.well-known,null,STOPPED} 2025-10-15T00:46:18,453 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@4b6685ae{/.well-known,null,STOPPED} 2025-10-15T00:46:18,453 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-18,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]} 2025-10-15T00:46:18,453 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-18,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}", size=2} 2025-10-15T00:46:18,453 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T00:46:18,453 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known 2025-10-15T00:46:18,453 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/.well-known" with default Osgi Context 
OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} 2025-10-15T00:46:18,454 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@4b6685ae{/.well-known,null,AVAILABLE} 2025-10-15T00:46:18,454 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-15,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path 2025-10-15T00:46:18,454 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T00:46:18,454 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-19,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]} 2025-10-15T00:46:18,454 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-19,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]}", size=1} 2025-10-15T00:46:18,454 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-19,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-15,WellKnownURIs,/.well-known}]} 2025-10-15T00:46:18,485 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone 2025-10-15T00:46:18,485 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix 2025-10-15T00:46:18,486 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone 2025-10-15T00:46:18,486 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix 2025-10-15T00:46:18,511 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer 2025-10-15T00:46:18,511 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - 
org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false 2025-10-15T00:46:18,556 | INFO | Blueprint Extender: 3 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 9.0.1 | Global RESTCONF northbound pools started 2025-10-15T00:46:18,559 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-24,contextPath='/auth'} 2025-10-15T00:46:18,559 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2} 2025-10-15T00:46:18,559 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-24,contextPath='/auth'} 2025-10-15T00:46:18,560 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@358edfb0{/auth,null,STOPPED} 2025-10-15T00:46:18,561 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@358edfb0{/auth,null,STOPPED} 2025-10-15T00:46:18,561 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-15T00:46:18,561 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}", size=2} 2025-10-15T00:46:18,561 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-15T00:46:18,561 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} 2025-10-15T00:46:18,562 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter 2025-10-15T00:46:18,562 | INFO | paxweb-config-3-thread-1 | 
CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters: 2025-10-15T00:46:18,562 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@358edfb0{/auth,null,AVAILABLE} 2025-10-15T00:46:18,562 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-22,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path 2025-10-15T00:46:18,562 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T00:46:18,562 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.aaa.shiro_0.21.2 [172] registered context path /auth with 4 service(s) 2025-10-15T00:46:18,562 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}", size=2} 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known 2025-10-15T00:46:18,563 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 has been started 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-28,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-15T00:46:18,563 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - 
org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.21.2 [172] was successfully created 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-28,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]}", size=1} 2025-10-15T00:46:18,563 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-28,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-22,RealmManagement,/auth}]} 2025-10-15T00:46:19,188 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos: Elapsed time 32s, remaining time 267s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=397, STOPPING=0, FAILURE=0} 2025-10-15T00:46:19,189 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2025-10-15T00:46:19,189 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | Now notifying all its registered SystemReadyListeners... 2025-10-15T00:46:19,189 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | onSystemBootReady() received, starting the switch connections 2025-10-15T00:46:19,333 | INFO | epollEventLoopGroup-2-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2025-10-15T00:46:19,334 | INFO | epollEventLoopGroup-4-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2025-10-15T00:46:19,334 | INFO | epollEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2025-10-15T00:46:19,334 | INFO | epollEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2025-10-15T00:46:19,335 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@57e9dc55 started 2025-10-15T00:46:19,335 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@5486185b started 2025-10-15T00:46:19,335 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | All switchConnectionProviders are up and running (2). 
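
Once SystemReadyMonitor declares the system ready, the two switch connection providers report TCP/TLS listeners on ports 6653 and 6633, as logged above. A minimal reachability probe such as the one below (the controller address is a placeholder) can confirm those ports accept connections from a test host; note that a successful TCP connect by itself does not prove an OpenFlow HELLO exchange would complete.

#!/usr/bin/env python3
"""Quick reachability probe for the OpenFlow listeners reported above on
ports 6653 and 6633. A minimal sketch: CONTROLLER is a placeholder address."""
import socket

CONTROLLER = "127.0.0.1"  # placeholder; substitute the cluster member's address
PORTS = (6653, 6633)

def is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS:
        state = "open" if is_open(CONTROLLER, port) else "closed/unreachable"
        print(f"{CONTROLLER}:{port} {state}")
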
2025-10-15T00:48:56,722 | INFO | sshd-SshServer[40a42786](port=8101)-nio2-thread-1 | OpenSSHKeyPairProvider | 121 - org.apache.karaf.shell.ssh - 4.4.8 | Creating ssh server private key at /tmp/karaf-0.23.0/etc/host.key 2025-10-15T00:48:56,724 | INFO | sshd-SshServer[40a42786](port=8101)-nio2-thread-1 | OpenSSHKeyPairGenerator | 121 - org.apache.karaf.shell.ssh - 4.4.8 | generateKeyPair(RSA) generating host key - size=2048 2025-10-15T00:48:57,198 | INFO | sshd-SshServer[40a42786](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.242:59186 authenticated 2025-10-15T00:48:58,931 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot 2025-10-15T00:48:59,548 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables 2025-10-15T00:49:01,472 | INFO | qtp1808674182-487 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-15T00:49:01,475 | INFO | qtp1808674182-487 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-15T00:49:01,887 | INFO | qtp1808674182-487 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication is now enabled 2025-10-15T00:49:01,888 | INFO | qtp1808674182-487 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication Manager activated 2025-10-15T00:49:01,943 | INFO | qtp1808674182-487 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 9.0.1 | Consecutive slashes in REST URLs will be rejected 2025-10-15T00:49:07,165 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart 2025-10-15T00:49:08,288 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1 2025-10-15T00:49:11,242 | INFO | epollEventLoopGroup-5-1 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.170.212:41344, NodeId:null 2025-10-15T00:49:11,303 | INFO | epollEventLoopGroup-5-2 | ConnectionAdapterImpl | 
318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Hello received 2025-10-15T00:49:11,310 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connected. 2025-10-15T00:49:11,310 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | No context chain found for device: openflow:1, creating new. 2025-10-15T00:49:11,311 | INFO | epollEventLoopGroup-5-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Device connected to controller, Device:/10.30.170.212:41354, NodeId:Uri{value=openflow:1} 2025-10-15T00:49:11,328 | INFO | epollEventLoopGroup-5-2 | RoleContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s. 2025-10-15T00:49:11,383 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T00:49:11,521 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2025-10-15T00:49:11,542 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower 2025-10-15T00:49:11,561 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T00:49:11,722 | INFO | qtp1808674182-485 | StaticConfiguration | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding-over-DOM codec shortcuts are enabled 2025-10-15T00:49:11,731 | INFO | qtp1808674182-485 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Ping Pong Flow Tester Impl 2025-10-15T00:49:11,732 | INFO | qtp1808674182-485 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Transaction Chain Flow Writer Impl 2025-10-15T00:49:11,735 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Number of Txn for dpId: openflow:1 is: 1 2025-10-15T00:49:11,736 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@1029f228 for dpid: openflow:1 2025-10-15T00:49:11,760 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED 
[wasOwner=false, isOwner=true, hasOwner=true] 2025-10-15T00:49:11,760 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting DeviceContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:11,770 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RpcContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:11,797 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting StatisticsContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:11,797 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RoleContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:11,799 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}} 2025-10-15T00:49:11,799 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Requesting state change to BECOMEMASTER 2025-10-15T00:49:11,799 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER 2025-10-15T00:49:11,799 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | getGenerationIdFromDevice called for device: openflow:1 2025-10-15T00:49:11,803 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started clustering services for node openflow:1 2025-10-15T00:49:11,809 | INFO | epollEventLoopGroup-5-2 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER 2025-10-15T00:49:11,811 | INFO | epollEventLoopGroup-5-2 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER 2025-10-15T00:49:11,819 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}} 2025-10-15T00:49:11,840 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2025-10-15T00:49:11,840 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving 
connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T00:49:11,841 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 348.3 μs 2025-10-15T00:49:11,846 | INFO | pool-20-thread-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connection is enabled by reconciliation framework. 2025-10-15T00:49:11,864 | INFO | pool-20-thread-1 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.212}} 2025-10-15T00:49:11,865 | INFO | pool-20-thread-1 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Port number of the node openflow:1 is: 41354 2025-10-15T00:49:11,979 | INFO | pool-20-thread-1 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPMETERFEATURES collected 2025-10-15T00:49:11,989 | INFO | epollEventLoopGroup-5-2 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPGROUPFEATURES collected 2025-10-15T00:49:12,052 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.MacAddress 2025-10-15T00:49:12,052 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.PhysAddress 2025-10-15T00:49:12,052 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.HexString 2025-10-15T00:49:12,053 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.DottedQuad 2025-10-15T00:49:12,053 | INFO | epollEventLoopGroup-5-2 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class 
org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.Uuid 2025-10-15T00:49:12,063 | INFO | pool-20-thread-1 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPPORTDESC collected 2025-10-15T00:49:12,082 | WARN | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Seems like device is still owned by other controller instance. Skip deleting openflow:1 node from operational datastore. 2025-10-15T00:49:12,083 | INFO | pool-20-thread-1 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 successfully finished collecting 2025-10-15T00:49:12,166 | INFO | pool-20-thread-1 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 is able to work as master 2025-10-15T00:49:12,167 | INFO | pool-20-thread-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role MASTER was granted to device openflow:1 2025-10-15T00:49:12,167 | INFO | pool-20-thread-1 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node added notification for Uri{value=openflow:1} 2025-10-15T00:49:12,180 | INFO | pool-20-thread-1 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting statistics gathering for node openflow:1 2025-10-15T00:49:12,201 | INFO | opendaylight-cluster-data-notification-dispatcher-52 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T00:49:12,211 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | LazyBindingMap | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Using lazy population for maps larger than 1 element(s) 2025-10-15T00:49:12,842 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed FlowHandlerTask thread for dpid: openflow:1 2025-10-15T00:49:13,292 | INFO | CommitFutures-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed all flows installation for: dpid: openflow:1 in 1560284516ns 2025-10-15T00:49:13,294 | INFO | CommitFutures-3 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@1029f228 closed successfully. 
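
bulk-o-matic reports its completion time in nanoseconds above ("Completed all flows installation for: dpid: openflow:1 in 1560284516ns", roughly 1.56 s). The helper below converts that figure to seconds and an approximate install rate; the flow count is not recorded in these entries, so it is passed in as a hypothetical parameter rather than read from the log.

#!/usr/bin/env python3
"""Convert the bulk-o-matic completion time printed above (nanoseconds) into
seconds and an approximate install rate. The flow count is an assumed input,
not a value recorded in these log entries."""
import re

LINE = "Completed all flows installation for: dpid: openflow:1 in 1560284516ns"
NS_PATTERN = re.compile(r"in (\d+)ns")

def install_rate(log_line: str, flow_count: int) -> tuple[float, float]:
    """Return (elapsed_seconds, flows_per_second) parsed from a completion line."""
    match = NS_PATTERN.search(log_line)
    if match is None:
        raise ValueError("no 'in <N>ns' figure found")
    elapsed_s = int(match.group(1)) / 1_000_000_000
    return elapsed_s, flow_count / elapsed_s

if __name__ == "__main__":
    elapsed, rate = install_rate(LINE, flow_count=1000)  # hypothetical flow count
    print(f"elapsed ~ {elapsed:.3f} s, ~ {rate:.0f} flows/s for the assumed count")
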
2025-10-15T00:49:14,171 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader 2025-10-15T00:49:29,458 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart 2025-10-15T00:49:30,040 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit 2025-10-15T00:49:30,301 | INFO | epollEventLoopGroup-5-2 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.170.212:41354, NodeId:openflow:1 2025-10-15T00:49:30,301 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 disconnected. 2025-10-15T00:49:30,302 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1} 2025-10-15T00:49:30,309 | INFO | epollEventLoopGroup-5-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node removed notification for Uri{value=openflow:1} 2025-10-15T00:49:30,311 | INFO | epollEventLoopGroup-5-2 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1} 2025-10-15T00:49:30,311 | INFO | epollEventLoopGroup-5-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role SLAVE was granted to device openflow:1 2025-10-15T00:49:30,311 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RoleContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:30,312 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:30,312 | INFO | epollEventLoopGroup-5-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1 2025-10-15T00:49:30,313 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RpcContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:30,313 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:30,508 | INFO | ofppool-0 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | 
Closed clustering services for node openflow:1 2025-10-15T00:49:30,508 | INFO | epollEventLoopGroup-5-2 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services registration for node openflow:1 2025-10-15T00:49:30,509 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:30,509 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:30,509 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:30,509 | INFO | epollEventLoopGroup-5-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1 2025-10-15T00:49:30,510 | INFO | epollEventLoopGroup-5-2 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:30,507 | WARN | ofppool-1 | DeviceContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Error processing port status message for port 1 on device 1 org.opendaylight.mdsal.binding.api.TransactionChainClosedException: Cannot write into transaction. at org.opendaylight.openflowplugin.common.txchain.TransactionChainManager.addDeleteOperationToTxChain(TransactionChainManager.java:228) ~[?:?] at org.opendaylight.openflowplugin.impl.device.DeviceContextImpl.addDeleteToTxChain(DeviceContextImpl.java:261) ~[?:?] at org.opendaylight.openflowplugin.impl.device.DeviceContextImpl.lambda$writePortStatusMessage$1(DeviceContextImpl.java:381) ~[?:?] at com.google.common.collect.ImmutableList.forEach(ImmutableList.java:421) ~[?:?] at org.opendaylight.openflowplugin.impl.device.DeviceManagerImpl.lambda$new$0(DeviceManagerImpl.java:94) ~[?:?] at org.opendaylight.yangtools.util.concurrent.AbstractQueuedNotificationManager.executeBatch(AbstractQueuedNotificationManager.java:88) ~[?:?] at org.opendaylight.yangtools.util.concurrent.AbstractBatchingExecutor$DispatcherTask.invokeWorker(AbstractBatchingExecutor.java:305) ~[?:?] at org.opendaylight.yangtools.util.concurrent.AbstractBatchingExecutor$DispatcherTask.run(AbstractBatchingExecutor.java:292) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?] at java.lang.Thread.run(Thread.java:1583) [?:?] 2025-10-15T00:49:30,514 | ERROR | ofppool-1 | AbstractBatchingExecutor | 358 - org.opendaylight.yangtools.util - 14.0.17 | port-status-queue: Error invoking worker 1 with [org.opendaylight.openflowplugin.impl.device.DeviceContextImpl$$Lambda/0x00000007c18bcf90@69b5b462] java.lang.NullPointerException: Cannot invoke "org.opendaylight.openflowplugin.common.txchain.TransactionChainManager.releaseWriteTransactionLock()" because "this.transactionChainManager" is null at org.opendaylight.openflowplugin.impl.device.DeviceContextImpl.releaseWriteTransactionLock(DeviceContextImpl.java:618) ~[?:?] at org.opendaylight.openflowplugin.impl.device.DeviceContextImpl.lambda$writePortStatusMessage$1(DeviceContextImpl.java:388) ~[?:?] 
at com.google.common.collect.ImmutableList.forEach(ImmutableList.java:421) ~[?:?] at org.opendaylight.openflowplugin.impl.device.DeviceManagerImpl.lambda$new$0(DeviceManagerImpl.java:94) ~[?:?] at org.opendaylight.yangtools.util.concurrent.AbstractQueuedNotificationManager.executeBatch(AbstractQueuedNotificationManager.java:88) ~[?:?] at org.opendaylight.yangtools.util.concurrent.AbstractBatchingExecutor$DispatcherTask.invokeWorker(AbstractBatchingExecutor.java:305) ~[?:?] at org.opendaylight.yangtools.util.concurrent.AbstractBatchingExecutor$DispatcherTask.run(AbstractBatchingExecutor.java:292) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?] at java.lang.Thread.run(Thread.java:1583) [?:?] 2025-10-15T00:49:30,541 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2025-10-15T00:49:30,781 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2025-10-15T00:49:31,288 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T00:49:31,931 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=19427, lastAppliedTerm=2, lastIndex=20008, lastTerm=2, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=581, mandatoryTrim=false] 2025-10-15T00:49:31,936 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=19427, term=2]/EntryInfo[index=20008, term=2] 2025-10-15T00:49:31,938 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 18949 and term: 2 2025-10-15T00:49:32,005 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T00:49:31.937769750Z 2025-10-15T00:49:32,432 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1 2025-10-15T00:49:34,841 | INFO | epollEventLoopGroup-5-3 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.170.212:60476, NodeId:null 2025-10-15T00:49:34,905 | INFO | 
epollEventLoopGroup-5-4 | ConnectionAdapterImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Hello received 2025-10-15T00:49:34,912 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connected. 2025-10-15T00:49:34,912 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | No context chain found for device: openflow:1, creating new. 2025-10-15T00:49:34,912 | INFO | epollEventLoopGroup-5-4 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Device connected to controller, Device:/10.30.170.212:60486, NodeId:Uri{value=openflow:1} 2025-10-15T00:49:34,913 | INFO | epollEventLoopGroup-5-4 | RoleContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s. 2025-10-15T00:49:35,172 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1 2025-10-15T00:49:35,221 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2025-10-15T00:49:35,460 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting DeviceContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:35,461 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2025-10-15T00:49:35,461 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RpcContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:35,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting StatisticsContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:35,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting RoleContextImpl[NEW] service for node openflow:1 2025-10-15T00:49:35,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... 
nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}} 2025-10-15T00:49:35,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Requesting state change to BECOMEMASTER 2025-10-15T00:49:35,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | SalRoleRpc | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER 2025-10-15T00:49:35,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | getGenerationIdFromDevice called for device: openflow:1 2025-10-15T00:49:35,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Started clustering services for node openflow:1 2025-10-15T00:49:35,467 | INFO | epollEventLoopGroup-5-4 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER 2025-10-15T00:49:35,467 | INFO | epollEventLoopGroup-5-4 | RoleService | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER 2025-10-15T00:49:35,471 | INFO | ofppool-1 | FlowNodeReconciliationImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}} 2025-10-15T00:49:35,739 | INFO | pool-20-thread-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 connection is enabled by reconciliation framework. 2025-10-15T00:49:35,768 | INFO | epollEventLoopGroup-5-4 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.212}} 2025-10-15T00:49:35,769 | INFO | epollEventLoopGroup-5-4 | DeviceInitializationUtil | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Port number of the node openflow:1 is: 60486 2025-10-15T00:49:35,780 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPMETERFEATURES collected 2025-10-15T00:49:35,781 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPGROUPFEATURES collected 2025-10-15T00:49:35,789 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 info: OFPMPPORTDESC collected 2025-10-15T00:49:35,789 | INFO | epollEventLoopGroup-5-4 | OF13DeviceInitializer | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Static node openflow:1 successfully finished collecting 2025-10-15T00:49:35,810 | INFO | opendaylight-cluster-data-notification-dispatcher-53 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T00:49:35,812 | INFO | pool-20-thread-2 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 is able to work as master 2025-10-15T00:49:35,812 | INFO | pool-20-thread-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role MASTER was granted to device openflow:1 2025-10-15T00:49:35,812 | INFO | 
pool-20-thread-2 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node added notification for Uri{value=openflow:1} 2025-10-15T00:49:35,812 | INFO | pool-20-thread-2 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Starting statistics gathering for node openflow:1 2025-10-15T00:49:37,604 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 2025-10-15T00:49:37,821 | INFO | epollEventLoopGroup-5-4 | SystemNotificationsListenerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | ConnectionEvent: Connection closed by device, Device:/10.30.170.212:60486, NodeId:openflow:1 2025-10-15T00:49:37,821 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Device openflow:1 disconnected. 2025-10-15T00:49:37,822 | INFO | epollEventLoopGroup-5-4 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1} 2025-10-15T00:49:37,823 | INFO | epollEventLoopGroup-5-4 | DeviceManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Publishing node removed notification for Uri{value=openflow:1} 2025-10-15T00:49:37,824 | INFO | epollEventLoopGroup-5-4 | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | Stopping reconciliation for node Uri{value=openflow:1} 2025-10-15T00:49:37,824 | INFO | epollEventLoopGroup-5-4 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Role SLAVE was granted to device openflow:1 2025-10-15T00:49:37,824 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RoleContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:37,824 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:37,825 | INFO | epollEventLoopGroup-5-4 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1 2025-10-15T00:49:37,825 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping RpcContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:37,826 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1 2025-10-15T00:49:37,826 | INFO | epollEventLoopGroup-5-4 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services registration for node openflow:1 2025-10-15T00:49:37,826 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:37,826 | INFO | ofppool-1 | ContextChainImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Closed clustering services for node openflow:1 2025-10-15T00:49:37,827 | INFO | 
epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:37,827 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:37,827 | INFO | epollEventLoopGroup-5-4 | StatisticsContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Stopping running statistics gathering for node openflow:1 2025-10-15T00:49:37,827 | INFO | epollEventLoopGroup-5-4 | GuardedContextImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1 2025-10-15T00:49:37,981 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2025-10-15T00:49:37,981 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2025-10-15T00:49:38,486 | INFO | node-cleaner-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T00:49:39,957 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2 2025-10-15T00:49:42,695 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2 2025-10-15T00:49:43,140 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T00:49:43,261 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T00:49:44,370 | INFO | opendaylight-cluster-data-notification-dispatcher-53 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T00:49:45,114 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test 
openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2 2025-10-15T00:49:45,661 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T00:49:45,901 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T00:49:46,411 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T00:49:47,548 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader 2025-10-15T00:49:50,294 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader 2025-10-15T00:49:50,581 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T00:49:50,701 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T00:49:51,968 | INFO | opendaylight-cluster-data-notification-dispatcher-62 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T00:49:52,750 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader 2025-10-15T00:49:53,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T00:49:53,231 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : 
REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T00:49:53,736 | INFO | node-cleaner-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T00:49:55,148 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1 2025-10-15T00:49:55,311 | INFO | qtp1808674182-485 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Ping Pong Flow Tester Impl 2025-10-15T00:49:55,312 | INFO | qtp1808674182-485 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Using Transaction Chain Flow Writer Impl 2025-10-15T00:49:55,312 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Number of Txn for dpId: openflow:1 is: 1 2025-10-15T00:49:55,312 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@3e96185c for dpid: openflow:1 2025-10-15T00:49:55,345 | INFO | ForkJoinPool-10-worker-1 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed FlowHandlerTask thread for dpid: openflow:1 2025-10-15T00:49:55,390 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.lang.IllegalArgumentException: newPosition > limit: (2742201 > 262199) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] 
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 2025-10-15T00:49:55,440 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | newPosition > limit: (2742201 > 262199) 2025-10-15T00:49:55,450 | WARN | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (2742201 > 262199) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T00:49:55,451 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-43 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T00:50:55,350 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T00:50:55,353 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1 2025-10-15T00:50:55,357 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2025-10-15T00:50:55,358 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to 
ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T00:50:55,370 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 11.94 ms 2025-10-15T00:51:36,459 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader 2025-10-15T00:51:36,991 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (2742201 > 262199) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T00:51:36,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T00:51:36,993 | WARN | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (2742201 > 262199) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T00:51:36,993 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T00:51:37,509 | WARN | opendaylight-cluster-data-shard-dispatcher-47 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (2742201 > 262199) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 
2 more 2025-10-15T00:51:37,511 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T00:51:55,370 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config#-826035993], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T00:51:55,371 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-44 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1 2025-10-15T00:51:55,377 | INFO | CommitFutures-5 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Completed all flows installation for: dpid: openflow:1 in 536989750943ns 2025-10-15T00:51:55,378 | ERROR | CommitFutures-5 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Error: TransactionCommitFailedException{message=canCommit encountered an unexpected failure, errorList=[RpcError [message=canCommit encountered an unexpected failure, severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-2-datastore-config-fe-0-chn-8-txn-0-1, sequence=11, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#-1880156409], modifications=0, protocol=SIMPLE} timed out after 120.026270516 seconds. 
The backend for inventory is not available.]]} in Datastore write operation: dpid: openflow:1, begin tableId: 0, end tableId: 1, sourceIp: 10001 2025-10-15T00:51:55,378 | ERROR | CommitFutures-4 | FlowWriterTxChain | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@3e96185c FAILED due to: org.opendaylight.mdsal.common.api.TransactionCommitFailedException: canCommit encountered an unexpected failure at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:42) ~[bundleFile:14.0.18] at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.newWithCause(TransactionCommitFailedExceptionMapper.java:18) ~[bundleFile:14.0.18] at org.opendaylight.yangtools.util.concurrent.ExceptionMapper.apply(ExceptionMapper.java:98) ~[bundleFile:14.0.17] at org.opendaylight.mdsal.dom.spi.TransactionCommitFailedExceptionMapper.apply(TransactionCommitFailedExceptionMapper.java:37) ~[bundleFile:14.0.18] at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker.handleException(ConcurrentDOMDataBroker.java:189) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.ConcurrentDOMDataBroker$1.onFailure(ConcurrentDOMDataBroker.java:133) ~[bundleFile:?] at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1117) ~[bundleFile:?] at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1004) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:767) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:516) ~[bundleFile:?] at com.google.common.util.concurrent.SettableFuture.setException(SettableFuture.java:54) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractProxyTransaction.lambda$directCommit$4(AbstractProxyTransaction.java:510) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$handleReplayedModifyTransactionRequest$16(RemoteProxyTransaction.java:510) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:47) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:431) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:147) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:48) ~[bundleFile:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:86) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) ~[?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) ~[?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) ~[?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) ~[?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) ~[?:?] Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ModifyTransactionRequest{target=member-2-datastore-config-fe-0-chn-8-txn-0-1, sequence=11, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#-1880156409], modifications=0, protocol=SIMPLE} timed out after 120.026270516 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:432) ~[bundleFile:?] ... 26 more 2025-10-15T00:51:55,386 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
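The two exceptions above describe the same underlying condition: the write against the config datastore's inventory shard timed out after roughly 120 seconds because member-1-shard-inventory-config is rejecting client connections with NotLeaderException, and AbstractShardBackendResolver keeps retrying about once per second (the repeated warnings below). A minimal diagnostic sketch follows, in Java only for illustration; it assumes a Jolokia endpoint at http://localhost:8181/jolokia with default admin/admin credentials and an MBean named after the shard seen in this log. The endpoint, credentials, and exact MBean name are assumptions about the deployment, not something taken from this log.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

/**
 * Reads the (assumed) shard MBean over Jolokia so its Raft state can be
 * compared against the NotLeaderException above.
 */
public final class ShardLeaderCheck {
    public static void main(String[] args) throws Exception {
        // Assumed MBean name, derived from the shard named in the log
        // (member-1-shard-inventory-config); adjust member, shard and
        // datastore names for the actual deployment.
        String mbean = "org.opendaylight.controller:type=DistributedConfigDatastore,"
                + "Category=Shards,name=member-1-shard-inventory-config";
        // Assumed endpoint and credentials; not taken from this log.
        String url = "http://localhost:8181/jolokia/read/" + mbean;
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes());

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // A healthy shard typically reports a non-empty leader and a Raft
        // state of Leader or Follower; an empty leader would be consistent
        // with the connection failures logged here.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}

Running the same check against each cluster member, or watching when the warnings below stop, shows whether shard leadership is re-established before further requests hit the 120-second timeout seen in the RequestTimeoutException above.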
2025-10-15T00:51:56,405 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:51:57,424 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:51:58,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:51:59,464 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:00,484 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:01,504 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:02,523 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:03,548 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:04,564 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:05,583 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:06,604 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:07,624 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:08,644 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:52:09,666 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:10,684 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:11,704 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:12,724 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:13,744 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:52:14,763 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:15,785 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:16,804 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:17,824 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:18,843 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:19,864 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:20,886 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:21,905 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:22,923 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:23,944 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
[Identical "Failed to resolve shard" WARN records, each carrying the same TimeoutException / NotLeaderException stack trace as the 00:52:22,923 record above, recur on ForkJoinPool.commonPool-worker-4 at roughly one-second intervals: 00:52:23,944, 00:52:24,964, 00:52:25,985, 00:52:27,004, 00:52:28,024, 00:52:29,045, 00:52:30,064, 00:52:31,085, 00:52:32,103, 00:52:33,124, 00:52:34,144, 00:52:35,164, 00:52:36,185, 00:52:37,204, and 00:52:38,224.]
2025-10-15T00:52:39,245 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:40,263 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:52:41,283 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:42,304 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:43,324 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:44,348 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:45,363 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:46,383 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:47,404 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:48,424 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:49,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:50,464 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:51,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:52,503 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:53,523 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:52:54,544 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:55,564 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:56,584 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:57,602 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:58,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:52:59,644 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:00,663 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:01,684 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:02,704 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:03,724 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:04,743 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more
[13 additional, identical "Failed to resolve shard" WARN entries from AbstractShardBackendResolver (java.util.concurrent.TimeoutException: Connection attempt failed, caused by NotLeaderException: member-1-shard-inventory-config is not the current leader) were logged at roughly one-second intervals: 00:53:05,763, 00:53:06,785, 00:53:07,803, 00:53:08,824, 00:53:09,844, 00:53:10,864, 00:53:11,884, 00:53:12,904, 00:53:13,924, 00:53:14,943, 00:53:15,965, 00:53:16,985 and 00:53:18,005. Each carried the same stack trace as the entry above; the duplicate traces are omitted here.]
2025-10-15T00:53:19,024 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:19,818 | INFO | sshd-SshServer[40a42786](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.242:41626 authenticated 2025-10-15T00:53:20,043 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] 
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:53:20,544 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot 2025-10-15T00:53:20,989 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables 2025-10-15T00:53:21,065 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:22,085 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:23,104 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:53:23,746 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart 2025-10-15T00:53:24,124 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:25,143 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:26,164 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:27,186 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:28,203 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:29,223 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:30,244 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:31,265 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:53:32,284 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:33,304 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T00:53:34,199 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node
2025-10-15T00:53:34,324 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:34,636 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown
2025-10-15T00:53:35,348 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:36,364 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:37,388 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:38,406 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:39,425 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:40,444 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:41,465 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:42,484 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:43,506 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:44,525 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:45,544 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:46,565 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:53:47,584 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:48,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:49,629 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:50,645 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:51,664 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:52,684 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:53,703 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:54,724 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:55,747 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:53:56,764 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:57,784 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:58,806 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:53:59,826 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:00,846 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T00:54:01,864 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:02,885 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:03,905 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:04,926 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:05,944 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:06,963 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:07,984 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:09,003 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:10,024 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:11,044 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:12,065 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:13,083 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:14,104 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
2025-10-15T00:54:15,123 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
[Each of the 14 retry attempts above, 00:54:01,864 through 00:54:15,123, logged the same "java.util.concurrent.TimeoutException: Connection attempt failed" caused by the identical NotLeaderException stack trace shown for the 00:54:00,846 entry; the repeated traces are omitted here.]
2025-10-15T00:54:16,144 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:17,164 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:18,184 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:19,203 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:20,223 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:21,243 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:22,263 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:54:23,283 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:24,305 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:25,324 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:26,344 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:27,364 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:28,384 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:29,404 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:30,424 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:31,444 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:32,464 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:33,484 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:34,510 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:35,534 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:54:36,554 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:37,574 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:38,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:39,614 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:40,634 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:41,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:42,674 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:54:43,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:44,716 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:45,734 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:46,757 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:47,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:48,793 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:49,814 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:50,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:51,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:52,874 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:53,893 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:54,914 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:55,932 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:56,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:57,972 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:54:58,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:00,014 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:01,034 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:02,057 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:55:03,074 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:04,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:05,115 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:06,132 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:07,154 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:08,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:09,194 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:10,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:11,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:12,254 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:13,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:14,293 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:15,315 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:55:15,548 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown 2025-10-15T00:55:15,959 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1 2025-10-15T00:55:16,335 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:16,345 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower 2025-10-15T00:55:16,755 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader 2025-10-15T00:55:17,187 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart 2025-10-15T00:55:17,354 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:17,612 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node 2025-10-15T00:55:18,039 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart 2025-10-15T00:55:18,377 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:19,394 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] 
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:20,413 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] 
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:55:21,433 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:22,454 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:23,474 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:24,494 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T00:55:25,514 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:26,534 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:27,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:28,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:29,594 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:30,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:31,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:32,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:33,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:34,692 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:35,712 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:36,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:37,754 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:38,774 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:55:39,793 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:40,814 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:41,835 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:42,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:43,877 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:44,893 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:45,916 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:46,934 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:55:47,954 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:48,975 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:49,997 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:51,014 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:52,034 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:53,057 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:54,074 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:55,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:56,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:57,135 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:58,154 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:55:59,174 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:00,194 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:56:01,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:02,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:03,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:04,274 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:05,293 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:06,316 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T00:56:07,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T00:56:08,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:09,385 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:10,404 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:11,423 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:12,444 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:13,465 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:14,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:15,503 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:16,524 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:17,543 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:18,562 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:19,583 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:20,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:21,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:56:22,643 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:23,663 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:24,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:25,704 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:26,724 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:56:27,743 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:28,763 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:29,784 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:30,803 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:31,823 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:32,850 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:33,863 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:34,883 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:35,904 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
[Identical "Failed to resolve shard" WARN entries from AbstractShardBackendResolver, each with the same TimeoutException / NotLeaderException stack trace as above, repeat at roughly one-second intervals from 2025-10-15T00:56:35,904 through 2025-10-15T00:56:50,183 (15 occurrences).]
2025-10-15T00:56:51,203 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:52,222 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:53,245 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:56:54,268 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:55,283 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:56,303 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:57,323 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:58,343 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:56:59,363 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:00,383 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:01,403 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:02,424 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:03,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
[The repeated warnings below omit the stack trace, which is identical to the 00:57:03,443 entry above.]
2025-10-15T00:57:04,465 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:05,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:06,504 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:07,524 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:08,544 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:09,563 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:10,584 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:11,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:12,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:13,645 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:14,664 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:15,684 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:16,705 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:17,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:18,744 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:19,763 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:57:20,787 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:21,804 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:22,823 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:23,844 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:24,865 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:25,886 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:26,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:27,924 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:28,944 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:29,964 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:30,984 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T00:57:32,004 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:33,023 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:34,043 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:35,063 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:36,082 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:37,103 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:38,123 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:39,143 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:40,163 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:41,184 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:42,204 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:43,224 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:44,244 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:45,264 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:57:46,283 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
5 more 2025-10-15T00:57:47,304 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:48,323 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:49,343 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:50,364 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:51,383 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:52,403 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:53,423 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:54,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:55,463 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:56,484 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:57,504 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:58,524 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:57:59,542 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:58:00,563 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:01,583 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:02,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:03,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:04,643 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:05,663 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:06,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:07,704 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:08,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:09,743 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:10,764 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:11,783 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:12,804 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:58:13,823 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T00:58:14,843 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:15,863 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:16,883 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:17,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:18,924 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:19,943 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:20,963 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:21,990 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:23,004 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:24,023 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:25,043 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:26,064 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:27,083 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:28,103 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:29,122 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:30,144 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:31,163 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:32,184 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:33,203 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:34,224 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:35,242 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:36,264 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:37,284 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:38,303 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:39,323 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:58:40,343 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:41,362 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T00:58:42,384 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:43,403 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:44,424 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:45,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:46,463 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:47,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:48,503 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:49,523 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:50,543 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:51,562 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:52,583 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:53,605 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:54,624 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:55,643 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:56,664 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T00:58:57,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:58,703 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:58:59,724 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:00,743 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:01,764 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:02,783 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:03,803 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:04,834 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:05,854 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:59:06,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:07,893 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:08,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:09,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:10,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
[Fifteen further "Failed to resolve shard" WARN entries from AbstractShardBackendResolver follow, identical to the one above except for the timestamp: the same TimeoutException caused by the same NotLeaderException for member-1-shard-inventory-config (actor incarnation #-826035993), logged at roughly one-second intervals at 00:59:10,953, 00:59:11,973, 00:59:12,992, 00:59:14,014, 00:59:15,034, 00:59:16,053, 00:59:17,073, 00:59:18,094, 00:59:19,114, 00:59:20,133, 00:59:21,153, 00:59:22,174, 00:59:23,194, 00:59:24,215 and 00:59:25,235.]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:26,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:27,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:28,296 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:29,313 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:30,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:31,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:32,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:59:33,396 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:34,416 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:35,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:36,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:37,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:38,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T00:59:39,513 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:40,534 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:41,559 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:42,584 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:43,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:44,624 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:45,643 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:46,664 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:47,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:48,704 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:49,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:50,744 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:51,763 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:52,783 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:53,804 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T00:59:54,823 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:55,843 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:56,863 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:57,883 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T00:59:58,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T00:59:59,923 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:00,943 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:01,963 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:02,985 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:04,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:05,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:06,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:07,072 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
[The identical WARN from AbstractShardBackendResolver ("Failed to resolve shard", java.util.concurrent.TimeoutException: Connection attempt failed, caused by the same NotLeaderException for member-1-shard-inventory-config with an identical stack trace) recurs at roughly one-second intervals on 2025-10-15 at 01:00:07,072, 01:00:08,093, 01:00:09,113, 01:00:10,133, 01:00:11,154, 01:00:12,173, 01:00:13,193, 01:00:14,214, 01:00:15,234, 01:00:16,254, 01:00:17,273, 01:00:18,294, 01:00:19,314, 01:00:20,333 and 01:00:21,354.]
2025-10-15T01:00:22,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:23,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:24,413 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:25,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:00:26,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:27,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:28,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:29,514 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:30,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:31,555 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:32,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:33,594 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:00:34,614 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:00:35,634 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:36,652 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:37,674 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:38,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:39,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:40,732 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:41,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:42,774 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:43,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:44,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:45,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:46,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:47,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:48,894 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:49,914 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    ... (stack traces identical to the 01:00:34,614 entry)
2025-10-15T01:00:50,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:51,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:00:52,973 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:53,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:55,014 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:56,034 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:57,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:58,073 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:00:59,098 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:00,114 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:01,133 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:02,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:03,174 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:04,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:05,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:01:06,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:07,252 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:08,275 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:08,451 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart 2025-10-15T01:01:09,293 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] 
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:01:10,314 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:11,334 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:12,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:13,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:14,394 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:15,413 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:16,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:17,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:18,699 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:19,714 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:20,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:21,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:22,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:01:23,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:24,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:25,835 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:26,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:27,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:28,894 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:29,914 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:30,934 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:31,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:01:32,973 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:33,994 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:35,014 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:36,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:37,054 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:38,073 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:39,094 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:40,114 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:41,137 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:42,156 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:43,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:44,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:45,215 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:46,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:01:47,255 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:48,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:49,293 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:01:50,318 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:51,335 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:52,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:53,372 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:54,399 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:55,415 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:56,438 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:57,454 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:58,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:01:59,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:00,512 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:01,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:02,555 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:02:03,574 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:04,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:05,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:06,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:07,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:08,674 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:09,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:10,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:11,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:12,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:02:13,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:14,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:15,814 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:16,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:17,854 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:18,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:19,894 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:20,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:21,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:22,954 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:23,972 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:24,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:26,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:27,032 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:28,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:29,074 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
5 more 2025-10-15T01:02:30,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:31,114 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:32,134 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:33,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:34,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:35,194 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:36,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:37,234 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:38,254 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:39,274 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:40,294 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:41,313 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:42,335 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:43,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:44,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:45,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:46,413 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:47,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:48,454 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:49,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:49,751 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart
2025-10-15T01:02:50,218 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1
2025-10-15T01:02:50,495 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:50,586 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1
2025-10-15T01:02:50,948 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node
2025-10-15T01:02:51,350 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart
2025-10-15T01:02:51,512 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:52,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:53,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:54,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:02:55,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:56,612 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:57,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:02:58,654 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:02:59,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:00,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:01,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more
2025-10-15T01:03:01,796 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node
2025-10-15T01:03:02,154 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown
2025-10-15T01:03:02,734 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:03,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:04,774 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] 
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:05,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:06,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:07,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:08,854 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:03:09,879 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:10,893 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:11,914 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:12,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:13,952 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:14,974 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:15,994 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:17,012 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:18,034 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:19,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:20,073 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:21,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:22,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:23,134 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:24,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:25,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:26,192 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:27,212 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:03:28,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:29,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:30,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:31,292 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:32,313 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:33,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:34,352 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:35,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:36,395 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:37,414 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:03:38,433 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:39,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:40,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:41,494 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:42,513 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:43,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:44,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:45,574 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:46,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:47,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:48,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:49,654 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:50,674 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:51,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:52,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:03:53,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:03:54,754 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:55,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:56,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:57,814 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:58,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:03:59,856 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:00,874 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:01,895 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:02,914 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:03,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:04:04,954 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:04:05,973 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:06,992 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
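Every warning in this run has the same shape: AbstractShardBackendResolver reports a java.util.concurrent.TimeoutException ("Connection attempt failed") whose root cause is a NotLeaderException from member-1-shard-inventory-config, meaning the shard replica that was asked to accept the client connection is not the current Raft leader. When triaging logs like this in code, the deepest cause is usually the interesting part rather than the wrapping timeout. The following is a minimal sketch in plain Java; the helper class and method names are hypothetical, and only the exception class name is taken from the log.

public final class CauseChain {

    private CauseChain() {
        // utility class, not instantiable
    }

    /** Returns the deepest non-null cause of {@code t}, or {@code t} itself if it has none. */
    public static Throwable rootCause(final Throwable t) {
        Throwable current = t;
        while (current.getCause() != null && current.getCause() != current) {
            current = current.getCause();
        }
        return current;
    }

    /** True if the root cause looks like the "not the current leader" failure seen above. */
    public static boolean isNotLeader(final Throwable t) {
        // Match by class name so the helper needs no OpenDaylight compile-time dependency.
        return rootCause(t).getClass().getName()
                .equals("org.opendaylight.controller.cluster.access.commands.NotLeaderException");
    }
}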
2025-10-15T01:04:08,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:09,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:10,052 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:11,073 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:12,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:13,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:14,133 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
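The timestamps on these warnings (01:04:04,954, then 01:04:05,973, then 01:04:06,992, and so on) show the resolver re-attempting the shard connection roughly once per second for as long as the shard reports no leader. The sketch below only illustrates that generic retry-with-fixed-delay pattern; it is not the actual AbstractShardBackendResolver code, and the class, method, and parameter names are invented for the example.

import java.time.Duration;
import java.util.concurrent.Callable;

public final class RetryWithDelay {

    private RetryWithDelay() {
    }

    /** Runs {@code action} up to {@code maxAttempts} times, sleeping {@code delay} between failures. */
    public static <T> T retry(final Callable<T> action, final int maxAttempts, final Duration delay)
            throws Exception {
        if (maxAttempts < 1) {
            throw new IllegalArgumentException("maxAttempts must be positive");
        }
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.call();
            } catch (final Exception e) {
                last = e;                            // e.g. a timeout caused by NotLeaderException
                if (attempt < maxAttempts) {
                    Thread.sleep(delay.toMillis());  // give the shard time to elect a leader
                }
            }
        }
        throw last;
    }
}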
2025-10-15T01:04:15,154 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:16,174 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:17,196 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:18,214 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:19,234 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:20,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack traces identical to the 01:04:04,954 entry above]
2025-10-15T01:04:21,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:22,294 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:23,314 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:24,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:25,355 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:26,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:27,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:28,412 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:29,433 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:30,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:31,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:32,494 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:33,514 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:34,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:35,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:36,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:37,595 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:38,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:39,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:40,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:41,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:42,694 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:43,626 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node 2025-10-15T01:04:43,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:44,242 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1 2025-10-15T01:04:44,657 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart 2025-10-15T01:04:44,734 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:04:45,080 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart 2025-10-15T01:04:45,446 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node 2025-10-15T01:04:45,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:45,845 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart 2025-10-15T01:04:46,774 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:04:47,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:48,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:49,834 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:50,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:51,880 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:52,895 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:53,914 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:54,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:55,952 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:56,973 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:57,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:04:59,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:00,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:01,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:02,073 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:03,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:04,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:05,133 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:05:06,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:07,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:08,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:09,212 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:10,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:11,252 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:12,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:13,294 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:14,313 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:15,334 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:05:16,356 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:17,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:18,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:19,412 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:20,433 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:21,455 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:22,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:23,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:24,513 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:25,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:26,552 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:27,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:28,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:29,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:30,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	[stack trace identical to the 01:05:15,334 entry above]
2025-10-15T01:05:31,652 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:05:32,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:33,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:34,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:35,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:36,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:37,772 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:38,793 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:39,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:40,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:41,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:42,874 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:05:43,895 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:05:44,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:45,934 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:46,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:47,972 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:48,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:50,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:51,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:52,052 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:53,072 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:54,094 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:55,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:56,133 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:57,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:58,172 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:05:59,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:00,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:01,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:02,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:03,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:04,298 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:05,313 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:06,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:07,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:08,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:09,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:10,413 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:11,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
2025-10-15T01:06:12,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:13,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:14,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:15,513 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:16,532 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:17,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:18,572 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:19,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:20,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:21,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:22,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:23,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:24,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:25,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:06:26,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:27,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:28,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:29,793 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:30,812 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:31,832 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:32,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:33,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:34,893 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:35,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:36,932 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:37,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:06:38,972 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:39,992 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:41,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:42,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:43,051 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:44,072 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:45,093 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:46,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:47,133 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:48,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:49,176 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:50,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:51,212 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:06:52,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:53,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:54,272 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:55,293 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:56,312 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:57,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:58,352 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:06:59,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:00,392 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:01,414 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:02,432 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:03,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:04,472 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:07:05,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:06,512 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:07:07,534 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:07:08,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:09,574 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:10,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:11,612 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:12,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:13,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:14,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:15,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:16,713 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:17,733 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:18,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:19,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:20,794 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:21,814 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:22,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:23,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:24,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:25,895 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:26,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:27,934 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:28,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:29,972 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:30,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:07:32,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:33,032 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:34,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:35,072 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:36,092 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:07:37,114 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:38,132 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:39,154 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:40,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:41,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:42,212 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:43,232 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:44,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:45,273 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:46,292 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:47,313 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:48,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:49,355 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:50,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:07:51,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
	at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:52,413 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:53,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:54,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:55,472 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:56,494 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:57,514 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:07:58,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:07:59,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:00,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:01,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:02,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:03,633 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:04,653 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:05,673 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:08:06,693 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:07,714 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:08,734 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:09,753 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:10,774 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:11,793 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:12,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:13,833 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:14,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:15,873 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:16,892 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:17,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:18,934 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:19,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:20,973 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:21,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:23,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:24,034 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:08:25,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:26,073 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:27,094 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:28,113 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:29,133 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:30,154 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:31,172 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:32,193 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:33,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:34,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:35,253 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:36,272 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:37,297 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:38,312 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:39,333 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:40,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:41,373 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:42,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:43,414 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:44,434 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:45,453 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:46,473 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:47,493 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:08:48,513 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:49,533 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:50,553 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:08:51,573 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:52,593 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:53,613 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:54,632 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:55,655 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:56,672 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:57,692 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:58,712 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:08:59,734 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:00,756 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
	at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
	at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
	... 5 more
2025-10-15T01:09:01,773 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:02,792 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:03,813 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:04,834 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:05,853 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:06,874 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:07,893 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:08,913 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:09,933 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:10,953 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:11,973 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:12,993 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:14,013 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:15,033 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:16,053 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:17,072 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:09:18,092 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:19,115 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:20,132 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:21,153 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:22,173 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:23,195 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:24,213 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:25,233 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:26,252 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:27,272 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:28,293 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:09:29,314 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:30,332 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:31,353 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:32,374 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:33,393 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:34,422 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:35,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:36,463 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:37,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:38,503 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:39,523 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:40,543 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:41,563 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:42,582 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:43,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:09:44,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:45,643 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:46,663 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:47,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:48,703 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:49,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:50,742 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:51,763 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:52,783 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:53,802 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:54,822 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:55,842 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:09:56,863 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
2025-10-15T01:09:57,883 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:09:58,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:09:59,924 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:00,943 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:01,964 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:02,985 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:04,003 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:05,024 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:06,043 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:07,062 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:08,084 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:09,102 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:10,124 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:11,143 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:12,163 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:10:13,184 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:14,203 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:15,222 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:16,243 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:17,263 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:18,283 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:19,304 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:20,322 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:21,343 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:22,362 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:23,382 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:10:24,403 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:10:25,424 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:26,442 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:27,463 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:28,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:29,503 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:30,523 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:31,543 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:32,563 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:33,582 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:34,603 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:35,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:36,642 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:37,663 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:38,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:39,464 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart
2025-10-15T01:10:39,704 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:40,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:41,742 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:42,764 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:43,782 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:44,803 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:10:45,823 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:46,843 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:47,863 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:48,884 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:49,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:50,923 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:51,942 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:52,962 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:10:53,984 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:55,004 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:56,023 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:57,043 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:58,063 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:10:59,083 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:00,103 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:01,123 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:02,144 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:03,164 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:04,182 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:05,204 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:06,223 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:07,243 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:08,262 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:11:09,282 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:10,303 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:11,324 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:11:12,344 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:13,364 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:14,383 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:15,403 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:16,423 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:17,443 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:18,463 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:19,483 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:20,504 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:21,523 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:11:22,542 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:23,563 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:24,584 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:25,604 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:26,623 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:27,643 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:28,663 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:29,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:30,703 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:31,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:32,743 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:33,764 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:34,783 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:35,804 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
2025-10-15T01:11:36,824 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    [stack trace identical to the 01:11:21,523 entry above; omitted]
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:37,843 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:11:38,863 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:39,884 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:40,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:41,923 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:42,944 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:43,963 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:44,983 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:46,002 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:47,024 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:48,042 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:49,063 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:50,083 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:51,103 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:11:52,123 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:53,144 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:54,163 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:55,183 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:56,203 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:57,223 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] 
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:58,243 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:11:59,263 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:00,283 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:01,303 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:02,322 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:03,343 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] 
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
    ... 5 more
2025-10-15T01:12:19,664 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?]
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?]
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?]
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
    at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:20,683 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:12:20,812 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart 2025-10-15T01:12:21,193 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node 2025-10-15T01:12:21,570 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node 2025-10-15T01:12:21,703 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:21,979 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart 2025-10-15T01:12:22,723 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] 
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:12:23,743 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:24,593 | INFO | sshd-SshServer[40a42786](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.242:43978 authenticated 2025-10-15T01:12:24,763 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:25,235 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot 2025-10-15T01:12:25,619 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables 2025-10-15T01:12:25,783 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] 
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 
5 more 2025-10-15T01:12:26,803 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] 
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:27,822 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:28,222 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart 2025-10-15T01:12:28,843 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?] ... 5 more 2025-10-15T01:12:29,864 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Connection attempt failed at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.onConnectResponse(AbstractShardBackendResolver.java:185) ~[?:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.lambda$connectShard$2(AbstractShardBackendResolver.java:178) ~[?:?] at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:863) [?:?] at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:841) [?:?] at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:483) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader at org.opendaylight.controller.cluster.datastore.Shard.handleConnectClient(Shard.java:481) ~[?:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:297) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[?:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[?:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[?:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[?:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[?:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[?:?]
... 5 more
2025-10-15T01:12:30,883 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
Caused by: org.opendaylight.controller.cluster.access.commands.NotLeaderException: Actor Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-inventory-config#-826035993] is not the current leader
2025-10-15T01:12:31,903 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:32,924 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:33,943 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:34,963 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:35,983 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:37,003 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:38,023 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:39,043 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:39,777 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1
2025-10-15T01:12:40,063 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:40,139 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower
2025-10-15T01:12:40,507 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster
2025-10-15T01:12:40,872 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart
2025-10-15T01:12:41,085 | WARN | ForkJoinPool.commonPool-worker-4 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Connection attempt failed
2025-10-15T01:12:41,202 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes
2025-10-15T01:12:41,766 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], control stream] Upstream failed, cause: StreamTcpException: The connection closed with error: Connection reset
Oct 15, 2025 1:13:18 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Trying to lock /tmp/karaf-0.23.0/lock
Oct 15, 2025 1:13:18 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Lock acquired
Oct 15, 2025 1:13:18 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired INFO: Lock acquired. Setting startlevel to 100
2025-10-15T01:13:19,175 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.logging]) | EventAdminConfigurationNotifier | 5 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.3.0 | Logging configuration changed. (Event Admin service unavailable - no notification sent).
2025-10-15T01:13:19,255 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started
2025-10-15T01:13:19,295 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started
2025-10-15T01:13:19,350 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent
2025-10-15T01:13:19,363 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a] for service with service.id [15]
2025-10-15T01:13:19,364 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a] for service with service.id [40]
2025-10-15T01:13:19,384 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Starting with globalExtender setting: false
2025-10-15T01:13:19,388 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | ROOT | 93 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (93) Version = 2.2.6
2025-10-15T01:13:19,524 | INFO | activator-1-thread-1 | Activator | 113 - org.apache.karaf.management.server - 4.4.8 | Setting java.rmi.server.hostname system property to 127.0.0.1
2025-10-15T01:13:19,625 | INFO | activator-1-thread-2 | Activator | 99 - org.apache.karaf.deployer.features - 4.4.8 | Deployment finished. Registering FeatureDeploymentListener
2025-10-15T01:13:19,651 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,652 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,657 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,657 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,658 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,658 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,658 | INFO | activator-1-thread-1 | core | 83 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@61282d8c with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=43f410db-7254-4fd6-bfc5-ec7c66cfc64a
2025-10-15T01:13:19,712 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 115 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean
2025-10-15T01:13:19,803 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.8
2025-10-15T01:13:19,823 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.config.command/4.4.8
2025-10-15T01:13:19,938 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.8
2025-10-15T01:13:19,940 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.8
2025-10-15T01:13:19,960 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.features.command/4.4.8
2025-10-15T01:13:19,972 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.8. Missing service: [org.apache.karaf.http.core.ProxyService]
2025-10-15T01:13:19,979 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.instance.core/4.4.8
2025-10-15T01:13:19,988 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-15T01:13:19,989 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-15T01:13:19,990 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2025-10-15T01:13:19,992 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.kar.core/4.4.8
2025-10-15T01:13:19,996 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8
2025-10-15T01:13:19,997 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.package.core/4.4.8
2025-10-15T01:13:19,999 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.service.core/4.4.8
2025-10-15T01:13:20,007 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.8
2025-10-15T01:13:20,007 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.8
2025-10-15T01:13:20,010 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | Activator | 120 - org.apache.karaf.shell.core - 4.4.8 | Not starting local console. To activate set karaf.startLocalConsole=true
2025-10-15T01:13:20,050 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.8 has been started
2025-10-15T01:13:20,089 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.8. Missing service: [org.apache.sshd.server.SshServer]
2025-10-15T01:13:20,134 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.system.core/4.4.8
2025-10-15T01:13:20,146 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.8. Missing service: [org.apache.karaf.web.WebContainerService]
2025-10-15T01:13:20,193 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | Activator | 392 - org.ops4j.pax.web.pax-web-extender-war - 8.0.33 | Configuring WAR extender thread pool. Pool size = 3
2025-10-15T01:13:20,245 | INFO | activator-1-thread-1 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.8
2025-10-15T01:13:20,252 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 125 - org.apache.sshd.osgi - 2.15.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory
2025-10-15T01:13:20,304 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | Activator | 393 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | Starting Pax Web Whiteboard Extender
2025-10-15T01:13:20,348 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | log | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @3323ms to org.eclipse.jetty.util.log.Slf4jLog
2025-10-15T01:13:20,374 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because configuration has changed
2025-10-15T01:13:20,375 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics.
2025-10-15T01:13:20,375 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Pax Web Runtime started 2025-10-15T01:13:20,376 | INFO | paxweb-config-3-thread-1 (change config) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered 2025-10-15T01:13:20,381 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Starting BlueprintBundleTracker 2025-10-15T01:13:20,391 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.8 [120] was successfully created 2025-10-15T01:13:20,395 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [79] was successfully created 2025-10-15T01:13:20,396 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [78] was successfully created 2025-10-15T01:13:20,445 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-10-15T01:13:20,445 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Configuring JettyServerController{configuration=043b67ec-7780-4adc-a544-fd4487fb326a,state=UNCONFIGURED} 2025-10-15T01:13:20,445 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating Jetty server instance using configuration properties. 
2025-10-15T01:13:20,463 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Processing Jetty configuration from files: [etc/jetty.xml] 2025-10-15T01:13:20,603 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Found configured connector "jetty-default": 0.0.0.0:8181 2025-10-15T01:13:20,604 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Using configured jetty-default@4456cb45{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181 2025-10-15T01:13:20,605 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp119977020]@726b43c{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY] 2025-10-15T01:13:20,611 | INFO | paxweb-config-3-thread-1 (change controller) | JettyFactory | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding JMX support to Jetty server 2025-10-15T01:13:20,626 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.8 2025-10-15T01:13:20,651 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2025-10-15T01:13:20,651 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting JettyServerController{configuration=043b67ec-7780-4adc-a544-fd4487fb326a,state=STOPPED} 2025-10-15T01:13:20,652 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Server@6ecd335d{STOPPED}[9.4.57.v20241219] 2025-10-15T01:13:20,652 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.8+9-Ubuntu-0ubuntu122.04.1 2025-10-15T01:13:20,677 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0 2025-10-15T01:13:20,677 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults 2025-10-15T01:13:20,679 | INFO | paxweb-config-3-thread-1 (change controller) | session | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 660000ms 2025-10-15T01:13:20,740 | INFO | paxweb-config-3-thread-1 (change controller) | AbstractConnector | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@4456cb45{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} 2025-10-15T01:13:20,740 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @3723ms 2025-10-15T01:13:20,742 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpService factory 2025-10-15T01:13:20,743 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - 
org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.8 [105]] 2025-10-15T01:13:20,755 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.8 [124]] 2025-10-15T01:13:20,760 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.http.core/4.4.8 2025-10-15T01:13:20,766 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:20,766 | INFO | activator-1-thread-2 | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.web.core/4.4.8 2025-10-15T01:13:20,770 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [155]] 2025-10-15T01:13:20,770 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.33 [393]] 2025-10-15T01:13:20,772 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.33 [392]] 2025-10-15T01:13:20,777 | INFO | paxweb-config-3-thread-1 (change controller) | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-4,contextPath='/'} 2025-10-15T01:13:20,778 | INFO | paxweb-config-3-thread-1 (change controller) | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@3b2315fa,contexts=[{HS,OCM-5,context:1802559495,/}]} 2025-10-15T01:13:20,779 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@3b2315fa,contexts=null}", size=4} 2025-10-15T01:13:20,779 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-4,contextPath='/'} 2025-10-15T01:13:20,806 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CustomFilterAdapterConfigurationImpl | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, 
osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.0/etc/org.opendaylight.aaa.filterchain.cfg, component.id=4, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)} 2025-10-15T01:13:20,812 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{HS,id=OCM-5,name='context:1802559495',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:1802559495',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@6b70e007}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@66266857{/,null,STOPPED} 2025-10-15T01:13:20,814 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@66266857{/,null,STOPPED} 2025-10-15T01:13:20,815 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-3,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@3b2315fa,contexts=[{HS,OCM-5,context:1802559495,/}]} 2025-10-15T01:13:20,837 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:1802559495',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:1802559495',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@6b70e007}} 2025-10-15T01:13:20,855 | INFO | paxweb-config-3-thread-1 (change controller) | osgi | 155 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2025-10-15T01:13:20,886 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.AuthenticationService), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T01:13:20,898 | INFO | paxweb-config-3-thread-1 (change controller) | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@66266857{/,null,AVAILABLE} 2025-10-15T01:13:20,899 | INFO | paxweb-config-3-thread-1 (change controller) | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering 
OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:1802559495',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [155],contextId='context:1802559495',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@6b70e007}}} as OSGi service for "/" context path 2025-10-15T01:13:20,901 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpServiceRuntime 2025-10-15T01:13:20,904 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=1} 2025-10-15T01:13:20,904 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@66266857{/,null,AVAILABLE} 2025-10-15T01:13:20,920 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.21.2 [172]] 2025-10-15T01:13:20,922 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-10-15T01:13:20,922 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1} 2025-10-15T01:13:20,922 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2025-10-15T01:13:20,955 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T01:13:20,963 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.IIDMStore), 
(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T01:13:20,985 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | FileAkkaConfigurationReader | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | File-based Pekko configuration reader enabled 2025-10-15T01:13:20,997 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider starting 2025-10-15T01:13:21,164 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | ActorSystemProviderImpl | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating new ActorSystem 2025-10-15T01:13:21,520 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Slf4jLogger started 2025-10-15T01:13:21,738 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.170.142:2550] with UID [4400250880795484691] 2025-10-15T01:13:21,747 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Starting up, Pekko version [1.0.3] ... 2025-10-15T01:13:21,795 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Registered cluster JMX MBean [pekko:type=Cluster] 2025-10-15T01:13:21,801 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Started up successfully 2025-10-15T01:13:21,842 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.170.142:2550#4400250880795484691], selfDc [default]. 2025-10-15T01:13:22,062 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiActorSystemProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Actor System provider started 2025-10-15T01:13:22,083 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | FileModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Shard configuration provider started 2025-10-15T01:13:22,126 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.7. 
Missing service: [org.opendaylight.infrautils.diagstatus.DiagStatusServiceMBean] 2025-10-15T01:13:22,155 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:13:22,155 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], control stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:13:22,222 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#1636660794]], but this node is not initialized yet 2025-10-15T01:13:22,263 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | ThreadFactory for SystemReadyService created 2025-10-15T01:13:22,265 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)... 2025-10-15T01:13:22,266 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DiagStatusServiceImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service started 2025-10-15T01:13:22,267 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos() started... 2025-10-15T01:13:22,271 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | MBeanUtils | 198 - org.opendaylight.infrautils.diagstatus-api - 7.1.7 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL. 
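With the jetty-default connector listening on 0.0.0.0:8181, the JolokiaServlet registered under /jolokia, and the org.opendaylight.infrautils.diagstatus:type=SvcStatus MBean now registered, the readiness reported by the Diagnostic Status Service can be inspected over HTTP. The sketch below is one way to do that from a test client; the localhost address and the admin/admin credentials are assumptions for a local test instance and are not taken from this log.

// Minimal sketch: query the DiagStatus MBean through the Jolokia endpoint
// registered above (alias /jolokia on the jetty-default connector, port 8181).
// Host, port and the admin/admin credentials are assumptions for a local test
// instance; the MBean name is the one reported in the log.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public final class DiagStatusProbe {
    public static void main(String[] args) throws Exception {
        String url = "http://localhost:8181/jolokia/read/"
                + "org.opendaylight.infrautils.diagstatus:type=SvcStatus";
        String auth = Base64.getEncoder().encodeToString("admin:admin".getBytes());
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // Jolokia answers with a JSON document; HTTP 200 plus "status":200
        // in the body indicates the read succeeded.
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}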
2025-10-15T01:13:22,271 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DiagStatusServiceMBeanImpl | 199 - org.opendaylight.infrautils.diagstatus-impl - 7.1.7 | Diagnostic Status Service management started 2025-10-15T01:13:22,271 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.7 2025-10-15T01:13:22,295 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.openflowplugin.api.openflow.mastership.MastershipChangeServiceManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:22,304 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:22,308 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1. 
Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager] 2025-10-15T01:13:22,313 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.NotificationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService)] 2025-10-15T01:13:22,373 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory)] 2025-10-15T01:13:22,381 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))] 2025-10-15T01:13:22,382 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:22,383 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), 
(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:22,388 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | ReconciliationManagerImpl | 302 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.20.1 | ReconciliationManager started 2025-10-15T01:13:22,388 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.20.1 2025-10-15T01:13:22,389 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:22,394 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | MessageIntelligenceAgencyImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean 2025-10-15T01:13:22,396 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.20.1 2025-10-15T01:13:22,417 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:22,419 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OpenflowServiceRecoveryHandlerImpl | 299 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | Registering openflowplugin service recovery handlers 2025-10-15T01:13:22,423 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | CommandExtension | 120 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1. 
Missing service: [org.opendaylight.mdsal.binding.api.DataBroker, org.opendaylight.serviceutils.srm.spi.RegistryControl] 2025-10-15T01:13:22,427 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | SimpleBindingDOMCodecFactory | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Binding/DOM Codec enabled 2025-10-15T01:13:22,433 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activating 2025-10-15T01:13:22,435 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBindingDOMCodec | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec activated 2025-10-15T01:13:22,440 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DefaultBindingRuntimeGenerator | 328 - org.opendaylight.yangtools.binding-generator - 14.0.17 | Binding/YANG type support activated 2025-10-15T01:13:22,454 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activating 2025-10-15T01:13:22,454 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBindingRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Binding Runtime activated 2025-10-15T01:13:22,509 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime starting 2025-10-15T01:13:22,528 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | KarafFeaturesSupport | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Will attempt to integrate with Karaf FeaturesService 2025-10-15T01:13:23,030 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 9.0.1 | Netty transport backed by epoll(2) 2025-10-15T01:13:23,240 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-306099532]], but this node is not initialized yet 2025-10-15T01:13:23,270 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | SharedEffectiveModelContextFactory | 379 - org.opendaylight.yangtools.yang-parser-impl - 14.0.17 | Using weak references 2025-10-15T01:13:25,247 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiModuleInfoSnapshotImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | EffectiveModelContext generation 1 activated 2025-10-15T01:13:25,248 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | DOM Schema services activated 2025-10-15T01:13:25,248 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiDOMSchemaService | 251 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 14.0.18 | Updating context to generation 1 2025-10-15T01:13:25,252 | INFO | Start Level: Equinox Container: 
43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DOMRpcRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM RPC/Action router started 2025-10-15T01:13:25,257 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service starting 2025-10-15T01:13:25,260 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiRemoteOpsProvider | 197 - org.opendaylight.controller.sal-remoterpc-connector - 11.0.2 | Remote Operations service started 2025-10-15T01:13:25,351 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-33 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage MAPPED 2025-10-15T01:13:26,033 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBindingRuntimeContextImpl | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | BindingRuntimeContext generation 1 activated 2025-10-15T01:13:26,053 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBindingDOMCodecServicesImpl | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Binding/DOM Codec generation 1 activated 2025-10-15T01:13:26,054 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiDatastoreContextIntrospectorFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore Context Introspector activated 2025-10-15T01:13:26,057 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION starting 2025-10-15T01:13:26,335 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : config 2025-10-15T01:13:26,336 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T01:13:26,337 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T01:13:26,345 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-config 2025-10-15T01:13:26,372 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-config 2025-10-15T01:13:26,387 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Recovery complete 2025-10-15T01:13:26,453 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store config is using tell-based 
protocol 2025-10-15T01:13:26,456 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T01:13:26,457 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractModuleShardConfigProvider | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Config file exists - reading config from it 2025-10-15T01:13:26,458 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL starting 2025-10-15T01:13:26,460 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Create data store instance of type : operational 2025-10-15T01:13:26,460 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Creating ShardManager : shardmanager-operational 2025-10-15T01:13:26,469 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Starting ShardManager shard-manager-operational 2025-10-15T01:13:26,472 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DistributedDataStoreFactory | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Data store operational is using tell-based protocol 2025-10-15T01:13:26,475 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-config: Shard created, persistent : true 2025-10-15T01:13:26,476 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | GlobalBindingDOMCodecServices | 326 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.17 | Global Binding/DOM Codec activated with generation 1 2025-10-15T01:13:26,476 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-topology-config: Shard created, persistent : true 2025-10-15T01:13:26,478 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: Shard created, persistent : true 2025-10-15T01:13:26,485 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-toaster-config: Shard created, persistent : true 2025-10-15T01:13:26,492 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiBlockingBindingNormalizer | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter activated 2025-10-15T01:13:26,500 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Recovery complete 2025-10-15T01:13:26,504 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for 
MountPointService activated 2025-10-15T01:13:26,504 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-operational: Shard created, persistent : false 2025-10-15T01:13:26,505 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-topology-operational: Shard created, persistent : false 2025-10-15T01:13:26,506 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-operational: Shard created, persistent : false 2025-10-15T01:13:26,506 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-toaster-operational: Shard created, persistent : false 2025-10-15T01:13:26,509 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DOMNotificationRouter | 250 - org.opendaylight.mdsal.mdsal-dom-broker - 14.0.18 | DOM Notification Router started 2025-10-15T01:13:26,511 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService)] 2025-10-15T01:13:26,514 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-inventory-config/member-2-shard-inventory-config-notifier#566161392 created and ready for shard:member-2-shard-inventory-config 2025-10-15T01:13:26,514 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-inventory-operational/member-2-shard-inventory-operational-notifier#-1218838849 created and ready for shard:member-2-shard-inventory-operational 2025-10-15T01:13:26,515 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationService activated 2025-10-15T01:13:26,516 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Starting recovery with journal batch size 1 2025-10-15T01:13:26,517 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | 
RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-topology-config/member-2-shard-topology-config-notifier#-7290538 created and ready for shard:member-2-shard-topology-config 2025-10-15T01:13:26,518 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Starting recovery with journal batch size 1 2025-10-15T01:13:26,518 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-toaster-config/member-2-shard-toaster-config-notifier#-562926499 created and ready for shard:member-2-shard-toaster-config 2025-10-15T01:13:26,518 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Starting recovery with journal batch size 1 2025-10-15T01:13:26,518 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Starting recovery with journal batch size 1 2025-10-15T01:13:26,519 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-topology-operational/member-2-shard-topology-operational-notifier#-292692369 created and ready for shard:member-2-shard-topology-operational 2025-10-15T01:13:26,519 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Starting recovery with journal batch size 1 2025-10-15T01:13:26,520 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-toaster-operational/member-2-shard-toaster-operational-notifier#780365619 created and ready for shard:member-2-shard-toaster-operational 2025-10-15T01:13:26,520 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-default-config/member-2-shard-default-config-notifier#-840664434 created and ready for shard:member-2-shard-default-config 2025-10-15T01:13:26,520 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Starting recovery with journal batch size 1 2025-10-15T01:13:26,521 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-default-operational/member-2-shard-default-operational-notifier#-2034053892 created and ready for shard:member-2-shard-default-operational 
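While the shards above recover and register their RoleChangeNotifiers, the remote members are still absent: the connection-refused warnings toward 10.30.170.126:2550 and the InitJoin exchanges earlier in this capture are typical of a cluster whose members are all still starting. Once the other seed nodes are up, the member view can be checked through the pekko:type=Cluster MBean that was registered during cluster startup, using the same Jolokia read pattern as in the previous sketch; host, port and credentials remain assumptions.

// Minimal sketch: read the Pekko cluster MBean (pekko:type=Cluster) through the
// same Jolokia endpoint to inspect the member view once the seed nodes have
// joined. Host, port and the admin/admin credentials are assumptions, as before.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public final class ClusterMembersProbe {
    public static void main(String[] args) throws Exception {
        String auth = Base64.getEncoder()
                .encodeToString("admin:admin".getBytes());
        HttpRequest request = HttpRequest.newBuilder(
                        URI.create("http://localhost:8181/jolokia/read/pekko:type=Cluster"))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();
        // Reading the whole MBean avoids guessing attribute names; the JSON
        // response lists all of its attributes (the cluster's member view is
        // expected to be among them).
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}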
2025-10-15T01:13:26,521 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Starting recovery with journal batch size 1 2025-10-15T01:13:26,521 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Starting recovery with journal batch size 1 2025-10-15T01:13:26,522 | INFO | opendaylight-cluster-data-pekko.persistence.dispatchers.default-plugin-dispatcher-48 | SegmentedFileJournal | 191 - org.opendaylight.controller.sal-akka-segmented-journal - 11.0.2 | Initialized with root directory segmented-journal with storage DISK 2025-10-15T01:13:26,536 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService)] 2025-10-15T01:13:26,537 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for NotificationPublishService activated 2025-10-15T01:13:26,539 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,539 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,539 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcService activated 2025-10-15T01:13:26,545 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,549 | INFO | Start Level: Equinox Container: 
43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for RpcProviderService activated 2025-10-15T01:13:26,619 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,620 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,622 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionService activated 2025-10-15T01:13:26,633 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for ActionProviderService activated 2025-10-15T01:13:26,634 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | DynamicBindingAdapter | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | 8 DOMService trackers started 2025-10-15T01:13:26,635 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,635 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2025-10-15T01:13:26,637 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: journal open: applyTo=0 2025-10-15T01:13:26,637 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: journal open: applyTo=0 2025-10-15T01:13:26,644 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | GlobalBindingRuntimeContext | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 
| Global BindingRuntimeContext generation 1 activated 2025-10-15T01:13:26,645 | INFO | Start Level: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | OSGiModelRuntime | 333 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.17 | Model Runtime started 2025-10-15T01:13:26,645 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: journal open: applyTo=77 2025-10-15T01:13:26,646 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: journal open: applyTo=0 2025-10-15T01:13:26,646 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: journal open: applyTo=0 2025-10-15T01:13:26,653 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: journal open: applyTo=0 2025-10-15T01:13:26,659 | INFO | Framework Event Dispatcher: Equinox Container: 43f410db-7254-4fd6-bfc5-ec7c66cfc64a | Main | 4 - org.ops4j.pax.logging.pax-logging-api - 2.3.0 | Karaf started in 8s. Bundle stats: 397 active, 398 total 2025-10-15T01:13:26,661 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,662 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,662 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,667 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , received role change from null to Follower 2025-10-15T01:13:26,667 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , received role change from null to Follower 2025-10-15T01:13:26,667 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , 
received role change from null to Follower 2025-10-15T01:13:26,677 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T01:13:26,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T01:13:26,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , received role change from null to Follower 2025-10-15T01:13:26,678 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T01:13:26,678 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-toaster-config from null to Follower 2025-10-15T01:13:26,678 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-default-operational from null to Follower 2025-10-15T01:13:26,678 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-topology-operational from null to Follower 2025-10-15T01:13:26,678 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T01:13:26,679 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-inventory-operational from null to Follower 2025-10-15T01:13:26,679 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , received role change from null to Follower 2025-10-15T01:13:26,680 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | 
RoleChangeNotifier for member-2-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T01:13:26,680 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-topology-config from null to Follower 2025-10-15T01:13:26,685 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: journal open: applyTo=0 2025-10-15T01:13:26,689 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , received role change from null to Follower 2025-10-15T01:13:26,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2025-10-15T01:13:26,690 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-toaster-operational from null to Follower 2025-10-15T01:13:26,691 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Recovery completed - Switching actor to Follower - last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2025-10-15T01:13:26,720 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: journal open: applyTo=20021 2025-10-15T01:13:26,818 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , received role change from null to Follower 2025-10-15T01:13:26,818 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2025-10-15T01:13:26,819 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from null to Follower 2025-10-15T01:13:27,291 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | Recovery snapshot applied for member-2-shard-inventory-config in 340.6 ms: snapshotIndex=19427, snapshotTerm=2, 
journal-size=0
2025-10-15T01:13:27,292 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PekkoRecovery | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Recovery completed in 341.5 ms - Switching actor to Follower - last log index = 19427, last log term = 2, snapshot index = 19427, snapshot term = 2, journal size = 0
2025-10-15T01:13:27,350 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from null to Follower
2025-10-15T01:13:27,351 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config
2025-10-15T01:13:27,351 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from null to Follower
2025-10-15T01:13:31,625 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | CandidateRegistryInit | 185 - org.opendaylight.controller.eos-dom-akka - 11.0.2 | member-2 : Initial removal of candidates from previous iteration failed. Rescheduling.
java.util.concurrent.TimeoutException: Ask timed out on [Actor[pekko://opendaylight-cluster-data/system/singletonProxyOwnerSupervisor-no-dc#-296399453]] after [5000 ms]. Message of type [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesForMember]. A typical reason for `AskTimeoutException` is that the recipient actor didn't send a reply.
    at org.apache.pekko.actor.typed.scaladsl.AskPattern$.$anonfun$onTimeout$1(AskPattern.scala:141) ~[bundleFile:?]
    at org.apache.pekko.pattern.PromiseActorRef$.$anonfun$apply$1(AskSupport.scala:737) ~[bundleFile:?]
    at org.apache.pekko.actor.Scheduler$$anon$7.run(Scheduler.scala:491) ~[bundleFile:?]
    at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(LightArrayRevolverScheduler.scala:384) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.executeBucket$1(LightArrayRevolverScheduler.scala:332) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.nextTick(LightArrayRevolverScheduler.scala:336) ~[bundleFile:?]
    at org.apache.pekko.actor.LightArrayRevolverScheduler$$anon$3.run(LightArrayRevolverScheduler.scala:288) ~[bundleFile:?]
    at java.lang.Thread.run(Thread.java:1583) ~[?:?]
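The WARN above shows the EOS candidate cleanup ask timing out after 5000 ms and being rescheduled; the surrounding shard recovery records are the more useful health signal. One way to confirm that a recovered shard such as member-2-shard-inventory-config has settled into the expected Raft role is to read its shard MBean over Jolokia. This is only a sketch: it assumes the odl-jolokia feature is installed, that Jolokia answers on port 8181 with the default admin/admin credentials, and that the MBean naming and attribute names below match this release.

    # Sketch: read a datastore shard MBean over Jolokia to check its Raft state.
    # Assumptions (not taken from the log): Jolokia on port 8181, admin/admin
    # credentials, and this MBean naming pattern for the config datastore.
    import requests

    MEMBER = "10.30.170.142"          # member-2, address taken from the log
    SHARD = "member-2-shard-inventory-config"
    MBEAN = f"org.opendaylight.controller:type=DistributedConfigDatastore,Category=Shards,name={SHARD}"

    resp = requests.get(
        f"http://{MEMBER}:8181/jolokia/read/{MBEAN}",
        auth=("admin", "admin"),
        timeout=10,
    )
    resp.raise_for_status()
    value = resp.json().get("value", {})

    # RaftState should be Follower (or Leader on one member); SyncStatus should
    # eventually turn true, matching the "sync done true" records later in the log.
    print(value.get("RaftState"), value.get("SyncStatus"), value.get("Leader"))
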
2025-10-15T01:13:34,294 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoinAck message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon#-1588781870]] to [pekko://opendaylight-cluster-data@10.30.170.142:2550] 2025-10-15T01:13:34,352 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Welcome from [pekko://opendaylight-cluster-data@10.30.170.193:2550] 2025-10-15T01:13:34,360 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:13:34,360 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T01:13:34,360 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T01:13:34,361 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-1-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T01:13:34,361 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T01:13:34,361 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-1-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T01:13:34,361 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T01:13:34,361 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-1-shard-inventory-operational set to 
pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T01:13:34,361 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-1-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T01:13:34,359 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:13:34,362 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T01:13:34,362 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-473961370] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#-21216662] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2025-10-15T01:13:34,362 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-1-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-1-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-1-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T01:13:34,363 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address for peer member-1-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T01:13:34,395 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton identified at [pekko://opendaylight-cluster-data@10.30.170.193:2550/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-15T01:13:34,450 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | EmptyLocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.eos.akka.owner.supervisor.command.ClearCandidatesResponse] to Actor[pekko://opendaylight-cluster-data/temp/$a] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/temp/$a] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
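The dead-letter INFO records above are expected while members are still joining, and the records themselves name the knobs ('pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown') that silence or limit them. Before muting anything, it can be worth checking whether they stay at a handful per startup or keep growing. A minimal sketch, assuming the Karaf log is at the conventional data/log/karaf.log path (adjust as needed):

    # Sketch: count dead-letter records in the Karaf log, grouped by message type
    # and target actor. The log path is an assumption for this instance.
    from collections import Counter
    import re

    LOG = "data/log/karaf.log"  # assumed default location
    pattern = re.compile(r"Message \[([^\]]+)\] to (\S+) was (?:unhandled|not delivered)")

    counts = Counter()
    with open(LOG, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if "dead letters encountered" not in line:
                continue
            m = pattern.search(line)
            if m:
                counts[(m.group(1), m.group(2))] += 1

    for (msg_type, target), n in counts.most_common(10):
        print(f"{n:4d}  {msg_type}  ->  {target}")
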
2025-10-15T01:13:35,109 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.142:2550 2025-10-15T01:13:35,109 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.142:2550 2025-10-15T01:13:35,109 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-default-config 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-default-operational 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-topology-config 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-topology-operational 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-inventory-config 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2025-10-15T01:13:35,110 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.142:2550/user/shardmanager-config/member-2-shard-toaster-config 2025-10-15T01:13:35,112 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist).
2025-10-15T01:13:35,118 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | ClusterSingletonManager state change [Start -> Younger]
2025-10-15T01:13:35,267 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-306099532]] to [pekko://opendaylight-cluster-data@10.30.170.142:2550]
2025-10-15T01:13:35,268 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.142:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-306099532]] (version [1.0.3])
2025-10-15T01:13:35,678 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Ignoring received gossip from unknown [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.126:2550,4378045957425099581)]
2025-10-15T01:13:35,843 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-473961370] was unhandled. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-15T01:13:35,844 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#-21216662] was unhandled. [5] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
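The membership records above involve three members: this node at 10.30.170.142:2550 plus peers at 10.30.170.193:2550 and 10.30.170.126:2550, as seen in the InitJoin/InitJoinAck and MemberUp records. When a join stalls instead of completing like this one, a first sanity check is whether the remoting port is even reachable from this host. A small sketch, assuming plain TCP on the port 2550 shown in the pekko:// addresses:

    # Sketch: verify the Pekko remoting port of each cluster member is reachable.
    # Member names, addresses and port are taken from the pekko:// URIs in the log.
    import socket

    MEMBERS = {
        "member-1": ("10.30.170.193", 2550),
        "member-2": ("10.30.170.142", 2550),  # this node
        "member-3": ("10.30.170.126", 2550),
    }

    for name, (host, port) in MEMBERS.items():
        try:
            with socket.create_connection((host, port), timeout=3):
                print(f"{name}: {host}:{port} reachable")
        except OSError as exc:
            print(f"{name}: {host}:{port} NOT reachable ({exc})")
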
2025-10-15T01:13:36,115 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - is the new leader among reachable nodes (more leaders may exist) 2025-10-15T01:13:36,122 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.126:2550] to [Up] 2025-10-15T01:13:36,125 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:13:36,125 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is not the leader any more and not responsible for taking SBR decisions. 2025-10-15T01:13:36,125 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T01:13:36,125 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:13:36,125 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address 
pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T01:13:36,126 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T01:13:36,127 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T01:13:36,127 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T01:13:36,127 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T01:13:36,127 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T01:13:36,127 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T01:13:36,127 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | 
member-2-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T01:13:36,527 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:36,527 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:36,528 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:36,534 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:36,558 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-toaster-config status sync done false 2025-10-15T01:13:36,559 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6af444b5 2025-10-15T01:13:36,559 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6aa53647 2025-10-15T01:13:36,559 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@e1405ea 2025-10-15T01:13:36,560 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-topology-config status sync done false 2025-10-15T01:13:36,561 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@26303620 2025-10-15T01:13:36,561 | INFO | 
opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-toaster-operational status sync done false 2025-10-15T01:13:36,562 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-default-operational status sync done false 2025-10-15T01:13:36,563 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:36,571 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-topology-operational status sync done false 2025-10-15T01:13:36,572 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@b74738f 2025-10-15T01:13:36,575 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:36,585 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@78fb1092 2025-10-15T01:13:36,585 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-15T01:13:36,585 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-inventory-operational status sync done false 2025-10-15T01:13:36,589 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type OPERATIONAL activated 2025-10-15T01:13:36,590 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type OPERATIONAL started 2025-10-15T01:13:36,635 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-default-config, lastLogIndex=76, lastLogTerm=2}" message is 
greater than follower's term 2 - updating term 2025-10-15T01:13:36,645 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@10f39d30 2025-10-15T01:13:36,646 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-default-config status sync done false 2025-10-15T01:13:36,652 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-default-config status sync done true 2025-10-15T01:13:37,092 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-topology-operational status sync done true 2025-10-15T01:13:37,100 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-inventory-operational status sync done true 2025-10-15T01:13:37,135 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - is no longer leader 2025-10-15T01:13:37,326 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Follower): Term 3 in "RequestVote{term=3, candidateId=member-1-shard-inventory-config, lastLogIndex=20021, lastLogTerm=2}" message is greater than follower's term 2 - updating term 2025-10-15T01:13:37,340 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@209c485f 2025-10-15T01:13:37,340 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-15T01:13:37,341 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-inventory-config status sync done false 2025-10-15T01:13:37,341 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-inventory-config status sync done true 2025-10-15T01:13:37,343 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiDOMStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Datastore service type CONFIGURATION activated 2025-10-15T01:13:37,363 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiClusterAdmin | 193 - org.opendaylight.controller.sal-cluster-admin-impl - 11.0.2 | Cluster Admin services started 2025-10-15T01:13:37,372 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ConcurrentDOMDataBroker | 358 - org.opendaylight.yangtools.util - 14.0.17 | ThreadFactory created: CommitFutures 2025-10-15T01:13:37,374 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | DataBrokerCommitExecutor | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker commit exector started 2025-10-15T01:13:37,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ConcurrentDOMDataBroker | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | DOM Data Broker started 2025-10-15T01:13:37,381 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2025-10-15T01:13:37,383 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | AbstractAdaptedService | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding/DOM adapter for DataBroker activated 2025-10-15T01:13:37,444 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-10-15T01:13:37,446 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T01:13:37,458 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 is waiting for dependencies [Initial app config AaaCertServiceConfig] 2025-10-15T01:13:37,464 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiPasswordServiceConfigBootstrap | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | Listening for password service configuration 2025-10-15T01:13:37,465 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.IIDMStore), 
(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T01:13:37,472 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 26.74 ms 2025-10-15T01:13:37,475 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T01:13:37,477 | ERROR | opendaylight-cluster-data-notification-dispatcher-49 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | bundle org.opendaylight.aaa.idm-store-h2:0.21.2 (167)[org.opendaylight.aaa.datastore.h2.H2Store(5)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider 2025-10-15T01:13:37,480 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default iteration count=20000 2025-10-15T01:13:37,481 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will utilize default algorithm=SHA-512 2025-10-15T01:13:37,481 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | DefaultPasswordHashService | 170 - org.opendaylight.aaa.password-service-impl - 0.21.2 | DefaultPasswordHashService will not utilize a private salt, since none was configured 2025-10-15T01:13:37,500 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | H2Store | 167 - org.opendaylight.aaa.idm-store-h2 - 0.21.2 | H2 IDMStore activated 2025-10-15T01:13:37,502 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig, Initial app config ShiroConfiguration, (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth)] 2025-10-15T01:13:37,507 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app 
config DatastoreConfig, Initial app config ShiroConfiguration] 2025-10-15T01:13:37,521 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | EOSClusterSingletonServiceProvider | 257 - org.opendaylight.mdsal.mdsal-singleton-impl - 14.0.18 | Cluster Singleton Service started 2025-10-15T01:13:37,540 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | ietf-yang-library writer registered 2025-10-15T01:13:37,585 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-toaster-config status sync done true 2025-10-15T01:13:37,585 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-default-operational status sync done true 2025-10-15T01:13:37,585 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-topology-config status sync done true 2025-10-15T01:13:37,585 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-toaster-operational status sync done true 2025-10-15T01:13:37,588 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | AAAEncryptionServiceImpl | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | AAAEncryptionService activated 2025-10-15T01:13:37,597 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig] 2025-10-15T01:13:37,592 | INFO | opendaylight-cluster-data-notification-dispatcher-50 | OSGiEncryptionServiceConfigurator | 165 - org.opendaylight.aaa.encrypt-service-impl - 0.21.2 | Encryption Service enabled 2025-10-15T01:13:37,602 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)] 2025-10-15T01:13:37,625 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-10-15T01:13:37,625 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0} to 
ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T01:13:37,630 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 4.880 ms 2025-10-15T01:13:37,640 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ArbitratorReconciliationManagerImpl | 296 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.20.1 | ArbitratorReconciliationManager has started successfully. 2025-10-15T01:13:37,682 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-15T01:13:37,685 | INFO | Blueprint Extender: 3 | AaaCertMdsalProvider | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCertMdsalProvider Initialized 2025-10-15T01:13:37,709 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T01:13:37,713 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-15T01:13:37,744 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | DeviceOwnershipService started 2025-10-15T01:13:37,761 | INFO | Blueprint Extender: 3 | LazyBindingList | 325 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.17 | Using lazy population for lists larger than 16 element(s) 2025-10-15T01:13:37,826 | INFO | Blueprint Extender: 2 | LLDPSpeaker | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | LLDPSpeaker started, it will send LLDP frames each 5 seconds 2025-10-15T01:13:37,850 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | DefaultConfigPusher | 301 - 
org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.20.1 | DefaultConfigPusher has started. 2025-10-15T01:13:37,852 | INFO | Blueprint Extender: 3 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | Certificate Manager service has been initialized 2025-10-15T01:13:37,860 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2025-10-15T01:13:37,864 | INFO | Blueprint Extender: 3 | CertificateManagerService | 163 - org.opendaylight.aaa.cert - 0.21.2 | AaaCert Rpc Service has been initialized 2025-10-15T01:13:37,866 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.21.2 has been started 2025-10-15T01:13:37,867 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.cert_0.21.2 [163] was successfully created 2025-10-15T01:13:37,880 | INFO | Blueprint Extender: 2 | NodeConnectorInventoryEventTranslator | 300 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.20.1 | NodeConnectorInventoryEventTranslator has started. 2025-10-15T01:13:37,881 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.20.1 has been started 2025-10-15T01:13:37,882 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.20.1 [300] was successfully created 2025-10-15T01:13:37,902 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2025-10-15T01:13:37,905 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Checking if default entries must be created in IDM store 2025-10-15T01:13:37,923 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | FlowCapableTopologyProvider | 304 - org.opendaylight.openflowplugin.applications.topology-manager - 0.20.1 | Topology Manager service started. 
2025-10-15T01:13:38,006 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | rpc-requests-quota configuration property was changed to '20000' 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | global-notification-quota configuration property was changed to '64000' 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | switch-features-mandatory configuration property was changed to 'false' 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | enable-flow-removed-notification configuration property was changed to 'true' 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-rpc-enabled configuration property was changed to 'false' 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-count-limit configuration property was changed to '25600' 2025-10-15T01:13:38,010 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | barrier-interval-timeout-limit configuration property was changed to '500' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | echo-reply-timeout configuration property was changed to '2000' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-statistics-polling-on configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-table-statistics-polling-on configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-flow-statistics-polling-on configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-group-statistics-polling-on configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-meter-statistics-polling-on configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-port-statistics-polling-on configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | is-queue-statistics-polling-on configuration property was changed to 'true' 
2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | skip-table-features configuration property was changed to 'true' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | basic-timer-delay configuration property was changed to '3000' 2025-10-15T01:13:38,011 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | maximum-timer-delay configuration property was changed to '900000' 2025-10-15T01:13:38,012 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | use-single-layer-serialization configuration property was changed to 'true' 2025-10-15T01:13:38,012 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-min-threads configuration property was changed to '1' 2025-10-15T01:13:38,012 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-max-threads configuration property was changed to '32000' 2025-10-15T01:13:38,012 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | thread-pool-timeout configuration property was changed to '60' 2025-10-15T01:13:38,012 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-rate-limit-per-min configuration property was changed to '0' 2025-10-15T01:13:38,013 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-connection-hold-time-in-seconds configuration property was changed to '0' 2025-10-15T01:13:38,013 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | device-datastore-removal-delay configuration property was changed to '500' 2025-10-15T01:13:38,013 | INFO | Blueprint Extender: 2 | OSGiConfigurationServiceFactory | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file 2025-10-15T01:13:38,018 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | MD-SAL configuration-based SwitchConnectionProviders started 2025-10-15T01:13:38,021 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-default-impl' 2025-10-15T01:13:38,023 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg' 2025-10-15T01:13:38,024 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin' 2025-10-15T01:13:38,112 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | CommandExtension | 120 - 
org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.20.1 2025-10-15T01:13:38,124 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@478a6528 was registered as configuration listener to OpenFlowPlugin configuration service 2025-10-15T01:13:38,152 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] 2025-10-15T01:13:38,155 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] 2025-10-15T01:13:38,156 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OSGiDistributedDataStore | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Distributed Datastore type CONFIGURATION started 2025-10-15T01:13:38,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] already present 2025-10-15T01:13:38,159 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OSGiFactorySwitchConnectionConfiguration | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] already present 2025-10-15T01:13:38,252 | INFO | Blueprint Extender: 1 | StoreBuilder | 162 - org.opendaylight.aaa.authn-api - 0.21.2 | Found default domain in IDM store, skipping insertion of default data 2025-10-15T01:13:38,253 | INFO | Blueprint Extender: 1 | AAAShiroProvider | 172 - org.opendaylight.aaa.shiro - 0.21.2 | AAAShiroProvider Session Initiated 2025-10-15T01:13:38,257 | INFO | opendaylight-cluster-data-notification-dispatcher-49 | OSGiSwitchConnectionProviders | 316 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.20.1 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl' 2025-10-15T01:13:38,262 | INFO | Blueprint Extender: 3 | ForwardingRulesManagerImpl | 299 - 
org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.20.1 | ForwardingRulesManager has started successfully. 2025-10-15T01:13:38,263 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.20.1 has been started 2025-10-15T01:13:38,264 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.20.1 [299] was successfully created 2025-10-15T01:13:38,278 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@1872197d was registered as configuration listener to OpenFlowPlugin configuration service 2025-10-15T01:13:38,306 | INFO | Blueprint Extender: 3 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3 2025-10-15T01:13:38,307 | INFO | Blueprint Extender: 3 | LLDPActivator | 303 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.20.1 | LLDPDiscoveryListener started. 2025-10-15T01:13:38,308 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.20.1 has been started 2025-10-15T01:13:38,308 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.20.1 [303] was successfully created 2025-10-15T01:13:38,428 | INFO | Blueprint Extender: 1 | IniSecurityManagerFactory | 171 - org.opendaylight.aaa.repackaged-shiro - 0.21.2 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur. 
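The ConfigurationServiceFactoryImpl entries above show the OpenFlow provider settings being applied and then overlaid from 'file:/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg' (the path reported by the felix.fileinstall.filename entry). For reference, a minimal sketch of reading such a Karaf-style key=value .cfg file and echoing its properties in the same wording; this reader is illustrative only and is not the plugin's actual Felix FileInstall/ConfigAdmin loader.

# A minimal, illustrative reader for a Karaf-style .cfg file (java.util.Properties-like
# key=value lines); the path is the one reported by felix.fileinstall.filename above.
# This is not the plugin's actual loader, just a way to inspect the file by hand.
from pathlib import Path

CFG = Path("/tmp/karaf-0.23.0/etc/org.opendaylight.openflowplugin.cfg")

def load_cfg(path: Path) -> dict[str, str]:
    props: dict[str, str] = {}
    for raw in path.read_text().splitlines():
        line = raw.strip()
        if not line or line.startswith(("#", "!")):  # properties-file comments
            continue
        key, sep, value = line.partition("=")
        if sep:
            props[key.strip()] = value.strip()
    return props

if __name__ == "__main__":
    for key, value in sorted(load_cfg(CFG).items()):
        print(f"{key} configuration property was changed to '{value}'")

Run on the controller host to compare the file contents against the values logged above.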
2025-10-15T01:13:38,466 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/auth'} 2025-10-15T01:13:38,466 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2} 2025-10-15T01:13:38,466 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/auth'} 2025-10-15T01:13:38,467 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@1d923648{/auth,null,STOPPED} 2025-10-15T01:13:38,468 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@1d923648{/auth,null,STOPPED} 2025-10-15T01:13:38,480 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-10-15T01:13:38,480 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2025-10-15T01:13:38,480 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T01:13:38,481 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of 
FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2} 2025-10-15T01:13:38,481 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-15T01:13:38,481 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 837.5 μs 2025-10-15T01:13:38,482 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} 2025-10-15T01:13:38,484 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter 2025-10-15T01:13:38,485 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters: 2025-10-15T01:13:38,485 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@1d923648{/auth,null,AVAILABLE} 2025-10-15T01:13:38,485 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=298, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=172, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path 2025-10-15T01:13:38,486 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T01:13:38,487 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-14,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-10-15T01:13:38,487 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-14,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2} 2025-10-15T01:13:38,487 | INFO | 
paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-15T01:13:38,487 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T01:13:38,490 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-15,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-10-15T01:13:38,490 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-15,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=1} 2025-10-15T01:13:38,490 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-15,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]} 2025-10-15T01:13:38,491 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.aaa.shiro_0.21.2 [172] registered context path /auth with 4 service(s) 2025-10-15T01:13:38,506 | ERROR | Blueprint Extender: 1 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 9.0.1 | bundle org.opendaylight.netconf.restconf-server-mdsal:9.0.1 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(69)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation; 2025-10-15T01:13:38,510 | INFO | Blueprint Extender: 2 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | OpenFlowPluginProvider started, waiting for onSystemBootReady() 2025-10-15T01:13:38,511 | INFO | Blueprint Extender: 2 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@397c4363 2025-10-15T01:13:38,511 | INFO | Blueprint Extender: 2 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@495a015a 2025-10-15T01:13:38,520 | INFO | Blueprint Extender: 2 | OnfExtensionProvider | 308 - org.opendaylight.openflowplugin.extension-onf - 0.20.1 | ONF Extension Provider started. 
2025-10-15T01:13:38,521 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.20.1 has been started 2025-10-15T01:13:38,521 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.20.1 [309] was successfully created 2025-10-15T01:13:38,550 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:13:38,550 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:13:38,567 | INFO | Blueprint Extender: 1 | StoppableHttpServiceFactory | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278]] 2025-10-15T01:13:38,568 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-18,contextPath='/rests'} 2025-10-15T01:13:38,568 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2} 2025-10-15T01:13:38,568 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-18,contextPath='/rests'} 2025-10-15T01:13:38,568 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@76dc33b1{/rests,null,STOPPED} 2025-10-15T01:13:38,569 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@76dc33b1{/rests,null,STOPPED} 2025-10-15T01:13:38,570 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-10-15T01:13:38,570 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of 
FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2} 2025-10-15T01:13:38,571 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-15T01:13:38,571 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T01:13:38,571 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} 2025-10-15T01:13:38,571 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Initializing CustomFilterAdapter 2025-10-15T01:13:38,572 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 166 - org.opendaylight.aaa.filterchain - 0.21.2 | Injecting a new filter chain with 0 Filters: 2025-10-15T01:13:38,572 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@76dc33b1{/rests,null,AVAILABLE} 2025-10-15T01:13:38,572 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=311, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path 2025-10-15T01:13:38,572 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T01:13:38,573 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-21,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-10-15T01:13:38,573 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-21,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2} 2025-10-15T01:13:38,574 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth 2025-10-15T01:13:38,574 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T01:13:38,574 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T01:13:38,575 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 
8.0.33 | Registering ServletModel{id=ServletModel-22,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-10-15T01:13:38,575 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-22,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=1} 2025-10-15T01:13:38,575 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-22,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]} 2025-10-15T01:13:38,575 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /rests with 4 service(s) 2025-10-15T01:13:38,576 | INFO | paxweb-config-3-thread-1 | ServerModel | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-25,contextPath='/.well-known'} 2025-10-15T01:13:38,576 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2} 2025-10-15T01:13:38,576 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-25,contextPath='/.well-known'} 2025-10-15T01:13:38,577 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@36ff75e7{/.well-known,null,STOPPED} 2025-10-15T01:13:38,577 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@36ff75e7{/.well-known,null,STOPPED} 2025-10-15T01:13:38,578 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]} 2025-10-15T01:13:38,578 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-26,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}", size=2} 2025-10-15T01:13:38,578 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing 
filter configuration for context /auth 2025-10-15T01:13:38,579 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests 2025-10-15T01:13:38,579 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known 2025-10-15T01:13:38,579 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} 2025-10-15T01:13:38,579 | INFO | paxweb-config-3-thread-1 | ContextHandler | 139 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@36ff75e7{/.well-known,null,AVAILABLE} 2025-10-15T01:13:38,580 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 397 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-23,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=316, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path 2025-10-15T01:13:38,580 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context / 2025-10-15T01:13:38,581 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 396 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-28,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]} 2025-10-15T01:13:38,581 | INFO | paxweb-config-3-thread-1 | JettyServerController | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-28,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]}", size=1} 2025-10-15T01:13:38,581 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 394 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-28,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-23,WellKnownURIs,/.well-known}]} 2025-10-15T01:13:38,581 | INFO | Blueprint Extender: 1 | WhiteboardWebServer | 176 - org.opendaylight.aaa.web.osgi-impl - 0.21.2 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_9.0.1 [278] registered context path /.well-known with 3 service(s) 2025-10-15T01:13:38,582 | INFO | Blueprint Extender: 1 | YangLibraryWriterSingleton | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@33234eaf 2025-10-15T01:13:38,605 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone 2025-10-15T01:13:38,605 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 
- org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix 2025-10-15T01:13:38,606 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone 2025-10-15T01:13:38,606 | INFO | Blueprint Extender: 1 | StringValueObjectFactory | 331 - org.opendaylight.yangtools.binding-reflect - 14.0.17 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix 2025-10-15T01:13:38,629 | INFO | Blueprint Extender: 1 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer 2025-10-15T01:13:38,630 | INFO | Blueprint Extender: 1 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 9.0.1 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false 2025-10-15T01:13:38,665 | INFO | Blueprint Extender: 1 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 9.0.1 | Global RESTCONF northbound pools started 2025-10-15T01:13:38,666 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 79 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.21.2 has been started 2025-10-15T01:13:38,667 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 11.0.2 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.21.2 [172] was successfully created 2025-10-15T01:13:39,643 | INFO | SystemReadyService-0 | KarafSystemReady | 202 - org.opendaylight.infrautils.ready-impl - 7.1.7 | checkBundleDiagInfos: Elapsed time 17s, remaining time 282s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=397, STOPPING=0, FAILURE=0} 2025-10-15T01:13:39,643 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2025-10-15T01:13:39,643 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.7 | Now notifying all its registered SystemReadyListeners... 
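With the /auth, /rests and /.well-known web contexts registered above and the system reported ready, a quick liveness probe of those contexts can be useful. A minimal sketch follows, assuming stock ODL defaults: the 8181 port, the admin/admin credentials and the probed resource paths are assumptions rather than values taken from this log, and the probe is assumed to run on the controller host itself; which resources actually exist under each context depends on the installed features.

# Minimal sketch: probe the web contexts registered above ("/auth", "/rests", "/.well-known").
# Port 8181, admin/admin credentials and the probed paths are assumed ODL defaults, not
# values taken from this log; run on the controller host itself.
import base64
import urllib.error
import urllib.request

BASE = "http://127.0.0.1:8181"
AUTH = base64.b64encode(b"admin:admin").decode()

def probe(path: str) -> int:
    req = urllib.request.Request(BASE + path, headers={"Authorization": f"Basic {AUTH}"})
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # an HTTP error code still shows the filter/servlet chain is wired up

for path in ("/.well-known/host-meta", "/rests", "/auth/v1/domains"):
    print(path, probe(path))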
2025-10-15T01:13:39,643 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | onSystemBootReady() received, starting the switch connections 2025-10-15T01:13:39,773 | INFO | epollEventLoopGroup-2-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2025-10-15T01:13:39,773 | INFO | epollEventLoopGroup-4-1 | TcpServerFacade | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2025-10-15T01:13:39,774 | INFO | epollEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2025-10-15T01:13:39,774 | INFO | epollEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 318 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.20.1 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2025-10-15T01:13:39,774 | INFO | epollEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@495a015a started 2025-10-15T01:13:39,774 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@397c4363 started 2025-10-15T01:13:39,775 | INFO | epollEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | All switchConnectionProviders are up and running (2). 
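The switch listeners above report TCP/TLS connections being accepted on ports 6633 and 6653. A minimal connectivity check is sketched below; the port numbers are taken from the log, while the 127.0.0.1 host assumes the check runs on the controller itself.

# Minimal TCP connectivity check for the two switch listeners reported above; port numbers
# come from the log, the 127.0.0.1 host assumes the check runs on the controller host.
import socket

for port in (6633, 6653):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(3)
        result = sock.connect_ex(("127.0.0.1", port))
        status = "accepting connections" if result == 0 else f"not reachable (errno {result})"
        print(f"port {port}: {status}")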
2025-10-15T01:13:48,316 | INFO | qtp119977020-492 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication is now enabled 2025-10-15T01:13:48,316 | INFO | qtp119977020-492 | AuthenticationManager | 174 - org.opendaylight.aaa.tokenauthrealm - 0.21.2 | Authentication Manager activated 2025-10-15T01:13:49,778 | INFO | qtp119977020-490 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-15T01:13:49,782 | INFO | qtp119977020-490 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 9.0.1 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2025-10-15T01:13:50,010 | INFO | qtp119977020-490 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 9.0.1 | Consecutive slashes in REST URLs will be rejected 2025-10-15T01:13:53,903 | INFO | sshd-SshServer[5cc2fe0e](port=8101)-nio2-thread-2 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.242:49724 authenticated 2025-10-15T01:13:54,472 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart 2025-10-15T01:19:47,112 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1 2025-10-15T01:19:47,656 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart 2025-10-15T01:19:48,147 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1 2025-10-15T01:19:48,607 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1 2025-10-15T01:19:49,020 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single 
Switch.Verify No Flows In Cluster 2025-10-15T01:19:49,457 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart 2025-10-15T01:19:50,634 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader 2025-10-15T01:19:53,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:19:53,690 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:19:53,785 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:19:53,785 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:19:53,819 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader 2025-10-15T01:19:54,216 | WARN | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Seems like device is still owned by other controller instance. Skip deleting openflow:1 node from operational datastore. 
2025-10-15T01:19:54,287 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart 2025-10-15T01:19:54,756 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart 2025-10-15T01:19:55,141 | INFO | opendaylight-cluster-data-notification-dispatcher-54 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T01:21:35,537 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node 2025-10-15T01:21:35,832 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL1 10.30.170.193" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL1 10.30.170.193 2025-10-15T01:21:36,615 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: The connection closed with error: Connection reset 2025-10-15T01:21:40,933 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.193:2550, Up)]. 
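What follows is the RAFT failover triggered by killing member-1: UnreachableMember notifications, frontend connections dropping back to a reconnecting state, and follower shards starting new election terms. A minimal sketch for pulling those role transitions out of a karaf.log capture is given below; the regex is tied to the "Switching from behavior ... election term" wording of the RaftActorBehavior entries that appear further down and is illustrative, not an official parser.

# Minimal extractor for the RAFT role transitions logged below. The regex mirrors the
# "member-N-shard-... (Role) :- Switching from behavior Old to New, election term: T"
# wording of RaftActorBehavior in this capture; it is an illustration, not an official tool.
import re
import sys

PATTERN = re.compile(
    r"(?P<shard>member-\d+-shard-[\w-]+) \(\w+\) :- Switching from behavior "
    r"(?P<old>\w+) to (?P<new>\w+), election term: (?P<term>\d+)"
)

for line in sys.stdin:
    # finditer copes with captures like this one, where several entries share a physical line
    for m in PATTERN.finditer(line):
        print(f"{m.group('shard')}: {m.group('old')} -> {m.group('new')} (term {m.group('term')})")

Feed it the captured log on stdin, for example: python3 track_roles.py < karaf.log (both file names are hypothetical).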
2025-10-15T01:21:40,938 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:21:40,939 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:21:40,939 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T01:21:40,940 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T01:21:40,942 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 0 2025-10-15T01:21:40,942 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: refreshing backend for shard 0 2025-10-15T01:21:40,942 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2025-10-15T01:21:40,943 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: refreshing backend for shard 1 2025-10-15T01:21:44,014 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:44,037 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-473961370] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:21:44,039 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1281690984] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:21:44,040 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1281690984] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. 
This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:21:44,040 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1281690984] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:21:44,042 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1281690984] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:21:46,095 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,095 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,104 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Candidate): Starting new election term 4 2025-10-15T01:21:46,104 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Candidate): Starting new election term 4 2025-10-15T01:21:46,105 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,105 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,105 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from Follower to Candidate 2025-10-15T01:21:46,105 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , received role change from Follower to Candidate 2025-10-15T01:21:46,106 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@69f58965 2025-10-15T01:21:46,106 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@48540e93 2025-10-15T01:21:46,106 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,106 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-inventory-operational from Follower to Candidate 2025-10-15T01:21:46,106 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from Follower to Candidate 2025-10-15T01:21:46,109 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Candidate): Starting new election term 4 2025-10-15T01:21:46,109 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,109 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , received role change from Follower to Candidate 2025-10-15T01:21:46,109 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@29b28fec 2025-10-15T01:21:46,110 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-default-operational from Follower to Candidate 2025-10-15T01:21:46,115 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-3-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term 2025-10-15T01:21:46,118 | INFO | 
opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-10-15T01:21:46,119 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2ebebf53 2025-10-15T01:21:46,120 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-10-15T01:21:46,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-operational , received role change from Candidate to Leader 2025-10-15T01:21:46,120 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7180524b 2025-10-15T01:21:46,120 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from Candidate to Leader 2025-10-15T01:21:46,121 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-default-operational from Candidate to Leader 2025-10-15T01:21:46,121 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from Candidate to Leader 2025-10-15T01:21:46,125 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,126 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-default-operational#876125843], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present} 2025-10-15T01:21:46,127 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], 
sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-default-operational#876125843], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} 2025-10-15T01:21:46,127 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Candidate): Starting new election term 4 2025-10-15T01:21:46,128 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,128 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , received role change from Follower to Candidate 2025-10-15T01:21:46,128 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4ff2df82 2025-10-15T01:21:46,129 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-topology-config from Follower to Candidate 2025-10-15T01:21:46,130 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational#1480478886], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-default-operational#876125843], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 2.678 ms 2025-10-15T01:21:46,131 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-toaster-operational status sync done false 2025-10-15T01:21:46,131 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2199aa1e 2025-10-15T01:21:46,135 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Follower | 190 
- org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,137 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-10-15T01:21:46,137 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Candidate): Starting new election term 4 2025-10-15T01:21:46,137 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-config , received role change from Candidate to Leader 2025-10-15T01:21:46,138 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,138 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , received role change from Follower to Candidate 2025-10-15T01:21:46,138 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3c5b52fb 2025-10-15T01:21:46,138 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-topology-config from Candidate to Leader 2025-10-15T01:21:46,138 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7b15588b 2025-10-15T01:21:46,139 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-toaster-config from Follower to Candidate 2025-10-15T01:21:46,143 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-10-15T01:21:46,144 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4e988a0 2025-10-15T01:21:46,144 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-config , received role change from Candidate to Leader 
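The election sequence recorded above follows the standard Raft role transitions: a Follower that can no longer reach the leader starts a new election term as a Candidate, a Candidate that gathers a majority of votes switches to Leader, and any role that observes a higher term steps back down to Follower (as member-2-shard-default-config does later when it sees term 5 while still at term 4). The sketch below is a minimal, illustrative Java state machine of those transitions only; it is not OpenDaylight's RaftActorBehavior, and the class and method names (RaftRoleMachine, onLeaderUnreachable, and so on) are hypothetical.

// Minimal illustrative sketch of the Raft role transitions visible in the log above.
// Not OpenDaylight code; names are hypothetical.
enum RaftRole { FOLLOWER, CANDIDATE, LEADER }

final class RaftRoleMachine {
    private RaftRole role = RaftRole.FOLLOWER;
    private long term;

    RaftRoleMachine(long initialTerm) {
        this.term = initialTerm;
    }

    // Follower: "Leader ... is unreachable" -> start a new election term as Candidate.
    void onLeaderUnreachable() {
        if (role == RaftRole.FOLLOWER) {
            term++;                       // e.g. "Starting new election term 4"
            role = RaftRole.CANDIDATE;    // "Switching from behavior Follower to Candidate"
        }
    }

    // Candidate: a majority of granted RequestVote replies -> become Leader.
    void onMajorityVotesGranted() {
        if (role == RaftRole.CANDIDATE) {
            role = RaftRole.LEADER;       // "Switching from behavior Candidate to Leader"
        }
    }

    // Any role: a message carrying a higher term forces a step-down to Follower.
    void onHigherTermObserved(long otherTerm) {
        if (otherTerm > term) {
            term = otherTerm;
            role = RaftRole.FOLLOWER;
        }
    }

    RaftRole role() { return role; }
    long term() { return term; }
}

Each transition in the log is also reported twice: once by the RAFT behavior itself and once by the ShardManager after RoleChangeNotifier forwards the "received role change" notification.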
2025-10-15T01:21:46,144 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-toaster-config from Candidate to Leader 2025-10-15T01:21:46,285 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,290 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Candidate): Starting new election term 4 2025-10-15T01:21:46,291 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,291 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@79af4a1c 2025-10-15T01:21:46,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , received role change from Follower to Candidate 2025-10-15T01:21:46,292 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-topology-operational from Follower to Candidate 2025-10-15T01:21:46,299 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 4 2025-10-15T01:21:46,299 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@76f3a8e6 2025-10-15T01:21:46,299 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-topology-operational , received role change from Candidate to Leader 2025-10-15T01:21:46,300 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-topology-operational from Candidate to Leader 2025-10-15T01:21:46,304 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.193:2550 is unreachable 2025-10-15T01:21:46,308 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | 
member-2-shard-default-config (Candidate): Starting new election term 4 2025-10-15T01:21:46,309 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 4 2025-10-15T01:21:46,309 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , received role change from Follower to Candidate 2025-10-15T01:21:46,309 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@12c7fe83 2025-10-15T01:21:46,309 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from Follower to Candidate 2025-10-15T01:21:46,645 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational Received follower initial sync status for member-2-shard-toaster-operational status sync done true 2025-10-15T01:21:49,753 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node 2025-10-15T01:21:49,919 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:21:49,920 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:21:49,923 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Younger observed OldestChanged: [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550) -> myself] 2025-10-15T01:21:49,924 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | ClusterSingletonManager state change [Younger -> BecomingOldest] 2025-10-15T01:21:49,927 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Association to [pekko://opendaylight-cluster-data@10.30.170.193:2550] with UID [6407633197060589234] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. 
Reason: Cluster member removed, previous status [Down] 2025-10-15T01:21:50,938 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Retry [1], sending HandOverToMe to [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550)] 2025-10-15T01:21:51,504 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:51,954 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Retry [2], sending HandOverToMe to [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550)] 2025-10-15T01:21:52,388 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node 2025-10-15T01:21:52,620 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL1 10.30.170.193" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL1 10.30.170.193 2025-10-15T01:21:52,975 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Retry [3], sending HandOverToMe to [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550)] 2025-10-15T01:21:53,394 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:53,913 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:53,995 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Retry [4], sending HandOverToMe to [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550)] 2025-10-15T01:21:54,433 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of 
java.net.ConnectException: Connection refused 2025-10-15T01:21:55,015 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Retry [5], sending HandOverToMe to [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550)] 2025-10-15T01:21:55,471 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:55,990 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:56,034 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Retry [6], sending HandOverToMe to [Some(pekko://opendaylight-cluster-data@10.30.170.193:2550)] 2025-10-15T01:21:56,179 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Candidate): Starting new election term 5 2025-10-15T01:21:56,187 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 5 2025-10-15T01:21:56,188 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4721f63 2025-10-15T01:21:56,188 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-operational , received role change from Candidate to Leader 2025-10-15T01:21:56,190 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-inventory-operational from Candidate to Leader 2025-10-15T01:21:56,191 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-15T01:21:56,192 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-inventory-operational#-57150662], 
sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-10-15T01:21:56,193 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-inventory-operational#-57150662], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:21:56,194 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational#667273823], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-inventory-operational#-57150662], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 725.4 μs 2025-10-15T01:21:56,245 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-topology-operational#1544123826], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=2, shard=topology, dataTree=present} 2025-10-15T01:21:56,245 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=2} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-topology-operational#1544123826], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=2, shard=topology, dataTree=present}} 2025-10-15T01:21:56,247 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | 
member-2-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=2} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-topology-operational#1544123826], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=2, shard=topology, dataTree=present}} in 1.859 ms 2025-10-15T01:21:56,367 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate): Term 5 in "RequestVote{term=5, candidateId=member-3-shard-default-config, lastLogIndex=142, lastLogTerm=3}" message is greater than Candidate's term 4 - switching to Follower 2025-10-15T01:21:56,373 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 5 2025-10-15T01:21:56,373 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , received role change from Candidate to Follower 2025-10-15T01:21:56,374 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from Candidate to Follower 2025-10-15T01:21:56,374 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3b835c79 2025-10-15T01:21:56,375 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-15T01:21:56,375 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-default-config status sync done false 2025-10-15T01:21:56,377 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config Received follower initial sync status for member-2-shard-default-config status sync done true 2025-10-15T01:21:56,381 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2025-10-15T01:21:56,381 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | 
member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2025-10-15T01:21:56,382 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config#1161464405], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 474.9 μs 2025-10-15T01:21:56,512 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:56,936 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Member removed [pekko://opendaylight-cluster-data@10.30.170.193:2550], previous oldest [pekko://opendaylight-cluster-data@10.30.170.193:2550] 2025-10-15T01:21:56,937 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton manager starting singleton actor [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-15T01:21:56,938 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | ClusterSingletonManager state change [BecomingOldest -> Oldest] 2025-10-15T01:21:57,030 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed 
because of java.net.ConnectException: Connection refused 2025-10-15T01:21:57,055 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Singleton identified at [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2025-10-15T01:21:57,064 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2025-10-15T01:21:57,195 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | YangLibraryWriter | 291 - org.opendaylight.netconf.yanglib-mdsal-writer - 9.0.1 | ietf-yang-library writer started with modules-state enabled 2025-10-15T01:21:57,550 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.193:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.193/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:21:58,982 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1451504216]] to [pekko://opendaylight-cluster-data@10.30.170.142:2550] 2025-10-15T01:21:58,983 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.142:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/system/cluster/core/daemon/firstSeedNodeProcess-1#-1451504216]] (version [1.0.3]) 2025-10-15T01:21:59,060 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.193:2550] is JOINING, roles [member-1, dc-default], version [0.0.0] 2025-10-15T01:21:59,065 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_RETAINED_WITH_NO_CHANGE [wasOwner=true, isOwner=true, hasOwner=true] 2025-10-15T01:22:00,107 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:22:00,108 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 
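The rejoin above (InitJoin/InitJoinAck, the node marked JOINING, then MemberUp delivered to both shard managers) is ordinary Pekko cluster membership signalling. As a minimal sketch, assuming the classic Pekko actor API, an actor can observe the same UnreachableMember, MemberRemoved and MemberUp events by subscribing to cluster events; the listener below is illustrative only (MembershipListener is a hypothetical name, not OpenDaylight's ShardManager).

// Minimal sketch of subscribing to the Pekko cluster membership events that the
// ShardManager lines above report (UnreachableMember, MemberRemoved, MemberUp).
// Illustrative only; not OpenDaylight code.
import org.apache.pekko.actor.AbstractActor;
import org.apache.pekko.cluster.Cluster;
import org.apache.pekko.cluster.ClusterEvent;
import org.apache.pekko.cluster.ClusterEvent.MemberRemoved;
import org.apache.pekko.cluster.ClusterEvent.MemberUp;
import org.apache.pekko.cluster.ClusterEvent.UnreachableMember;

public class MembershipListener extends AbstractActor {
    private final Cluster cluster = Cluster.get(getContext().getSystem());

    @Override
    public void preStart() {
        // Replay current membership as events, then deliver subsequent changes.
        cluster.subscribe(getSelf(), ClusterEvent.initialStateAsEvents(),
            MemberUp.class, UnreachableMember.class, MemberRemoved.class);
    }

    @Override
    public void postStop() {
        cluster.unsubscribe(getSelf());
    }

    @Override
    public Receive createReceive() {
        return receiveBuilder()
            .match(UnreachableMember.class,
                m -> System.out.println("Unreachable: " + m.member().address()))
            .match(MemberRemoved.class,
                m -> System.out.println("Removed: " + m.member().address()))
            .match(MemberUp.class,
                m -> System.out.println("Up again: " + m.member().address()))
            .build();
    }
}

On MemberUp a real component re-resolves the rejoined member's actor paths, which is what the updatePeerAddress and "Peer address for peer ... set to" lines that follow correspond to.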
2025-10-15T01:22:00,108 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T01:22:00,108 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-1-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-1-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-1-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-1-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T01:22:00,110 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-15T01:22:00,110 | 
INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-1-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T01:22:00,109 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T01:22:00,110 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-1-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T01:22:00,111 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T01:22:00,111 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-1-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T01:22:00,111 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T01:22:00,112 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address for peer member-1-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T01:22:00,112 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-15T01:22:03,754 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:22:03,754 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:22:03,803 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): 
handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-1-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 38, snapshotTerm: 3, replicatedToAllIndex: -1 2025-10-15T01:22:03,804 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): follower member-1-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:22:03,804 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): Initiating install snapshot to follower member-1-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 38, leader lastIndex: 42, leader log size: 4 2025-10-15T01:22:03,813 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=42, lastAppliedTerm=4, lastIndex=42, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-1-shard-default-operational 2025-10-15T01:22:03,825 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Persising snapshot at EntryInfo[index=42, term=4]/EntryInfo[index=42, term=4] 2025-10-15T01:22:03,825 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 38 and term: 3 2025-10-15T01:22:03,830 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: snapshot is durable as of 2025-10-15T01:22:03.825362537Z 2025-10-15T01:22:03,986 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=5, success=false, followerId=member-1-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 263, snapshotTerm: 3, replicatedToAllIndex: -1 2025-10-15T01:22:03,986 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-1-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 4, snapshotTerm: 3, replicatedToAllIndex: -1 2025-10-15T01:22:03,986 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | 
AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): follower member-1-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:22:03,986 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): Initiating install snapshot to follower member-1-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 263, leader lastIndex: 264, leader log size: 1 2025-10-15T01:22:03,986 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): follower member-1-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:22:03,986 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=264, lastAppliedTerm=3, lastIndex=264, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-1-shard-inventory-operational 2025-10-15T01:22:03,987 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): Initiating install snapshot to follower member-1-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 4, leader lastIndex: 8, leader log size: 4 2025-10-15T01:22:03,987 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=8, lastAppliedTerm=4, lastIndex=8, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-1-shard-topology-operational 2025-10-15T01:22:03,989 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Persising snapshot at EntryInfo[index=8, term=4]/EntryInfo[index=8, term=4] 2025-10-15T01:22:03,989 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 4 and term: 3 2025-10-15T01:22:03,994 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: snapshot is durable as of 2025-10-15T01:22:03.989683500Z 2025-10-15T01:22:04,012 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): Snapshot successfully installed on follower member-1-shard-topology-operational (last chunk 1) - matchIndex set to 8, nextIndex set to 9 2025-10-15T01:22:04,043 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | AbstractLeader | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): Snapshot successfully installed on follower member-1-shard-default-operational (last chunk 1) - matchIndex set to 42, nextIndex set to 43 2025-10-15T01:22:04,176 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Persising snapshot at EntryInfo[index=264, term=3]/EntryInfo[index=264, term=3] 2025-10-15T01:22:04,177 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 263 and term: 3 2025-10-15T01:22:04,182 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: snapshot is durable as of 2025-10-15T01:22:04.176915879Z 2025-10-15T01:22:04,263 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T01:22:04,552 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-1-shard-inventory-operational (last chunk 3) - matchIndex set to 264, nextIndex set to 265 2025-10-15T01:22:05,109 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, nanosAgo=18988688936, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=2} 2025-10-15T01:22:05,848 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, nanosAgo=9659109142, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=2} 2025-10-15T01:22:16,624 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart 2025-10-15T01:28:08,446 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader 2025-10-15T01:28:11,633 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single 
Switch.Verify Flows In Switch After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart 2025-10-15T01:28:11,986 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:28:12,265 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:28:12,322 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, nanosAgo=386201926974, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2} 2025-10-15T01:28:13,558 | INFO | opendaylight-cluster-data-notification-dispatcher-63 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T01:29:52,468 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart 2025-10-15T01:29:53,034 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:29:53,235 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:29:53,742 | INFO | node-cleaner-1 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T01:29:55,435 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node 2025-10-15T01:29:55,897 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT 
MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart 2025-10-15T01:29:56,349 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart 2025-10-15T01:29:57,507 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2 2025-10-15T01:30:00,235 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:30:00,363 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2 2025-10-15T01:30:00,395 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:30:00,567 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=0}, nanosAgo=494446144070, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1} 2025-10-15T01:30:00,863 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart 2025-10-15T01:30:01,302 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart 2025-10-15T01:30:01,827 | INFO | opendaylight-cluster-data-notification-dispatcher-66 | ConnectionManagerImpl | 309 - 
org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1
2025-10-15T01:31:41,735 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2
2025-10-15T01:31:42,191 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL3 10.30.170.126" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL3 10.30.170.126
2025-10-15T01:31:46,288 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit
2025-10-15T01:31:47,416 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.126:2550, Up)].
2025-10-15T01:31:47,416 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550
2025-10-15T01:31:47,418 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist).
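The install-snapshot exchanges recorded earlier in this log (a follower reporting logLastIndex=-1 while the leader's snapshotIndex is ahead, then matchIndex/nextIndex jumping to the snapshot boundary once the install succeeds) follow standard Raft leader bookkeeping. A minimal, hypothetical Java sketch of that bookkeeping, with illustrative names rather than the sal-akka-raft implementation:

```java
// Hypothetical sketch of the leader-side bookkeeping visible in the entries above:
// a follower whose log ends before the leader's last snapshot gets an InstallSnapshot,
// and a successful install moves matchIndex/nextIndex to the snapshot boundary.
// Names are illustrative assumptions, not OpenDaylight or sal-akka-raft APIs.
final class FollowerProgressSketch {
    long matchIndex = -1;   // highest follower index known to match the leader
    long nextIndex = 0;     // next log index the leader would replicate

    boolean needsSnapshot(long followerLogLastIndex, long leaderSnapshotIndex) {
        // e.g. logLastIndex=-1 against leader snapshotIndex=38 in the entries above
        return followerLogLastIndex < leaderSnapshotIndex;
    }

    void onSnapshotInstalled(long snapshotLastIndex) {
        // e.g. "matchIndex set to 42, nextIndex set to 43" in the entries above
        matchIndex = snapshotLastIndex;
        nextIndex = snapshotLastIndex + 1;
    }
}
```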
2025-10-15T01:31:47,418 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550
2025-10-15T01:31:47,420 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2025-10-15T01:31:47,420 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 0
2025-10-15T01:31:47,755 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2025-10-15T01:31:48,087 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-topology-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, nanosAgo=601787817283, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=2}
2025-10-15T01:31:48,285 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - is the new leader among reachable nodes (more leaders may exist)
2025-10-15T01:31:50,464 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-15T01:31:50,485 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-default-operational#876125843] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [191] dead letters encountered, of which 180 were not logged. The counter will be reset now. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-15T01:31:50,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-topology-config#-931617513] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-15T01:31:50,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-toaster-config#813139965] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-15T01:31:50,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2025-10-15T01:31:50,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1281690984] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
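The dead-letter notices above name the two settings that control this logging, 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. A minimal sketch of overriding them when building a plain Pekko ActorSystem; the values shown are examples only, and an OpenDaylight deployment would normally carry these settings in its own pekko configuration files instead:

```java
// Minimal sketch: tuning the dead-letter notices seen above by overriding the
// two settings named in the log. Assumes a plain Pekko ActorSystem; the system
// name and the chosen values are examples, not this deployment's configuration.
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;
import org.apache.pekko.actor.ActorSystem;

public final class DeadLetterTuning {
    public static void main(String[] args) {
        Config overrides = ConfigFactory.parseString(
            "pekko.log-dead-letters = 10\n"                      // cap how many notices are logged
            + "pekko.log-dead-letters-during-shutdown = off\n"); // silence them during shutdown
        ActorSystem system = ActorSystem.create("example-system",
            overrides.withFallback(ConfigFactory.load()));
        system.terminate();
    }
}
```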
2025-10-15T01:31:50,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-473961370] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:31:50,488 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#-21216662] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:31:50,488 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.GossipStatus] from Actor[pekko://opendaylight-cluster-data/system/cluster/core/daemon#244308484] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:31:50,488 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-topology-operational#1544123826] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:31:50,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#1281690984] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. 
If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:31:50,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-2-shard-inventory-operational#-57150662] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2025-10-15T01:31:52,635 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.126:2550 is unreachable 2025-10-15T01:31:52,640 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate): Starting new election term 6 2025-10-15T01:31:52,640 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 6 2025-10-15T01:31:52,641 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@71bd0b56 2025-10-15T01:31:52,641 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-default-config , received role change from Follower to Candidate 2025-10-15T01:31:52,642 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from Follower to Candidate 2025-10-15T01:31:52,657 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 6 2025-10-15T01:31:52,658 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7a59f297 2025-10-15T01:31:52,658 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for 
member-2-shard-default-config , received role change from Candidate to Leader 2025-10-15T01:31:52,659 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-default-config from Candidate to Leader 2025-10-15T01:31:52,659 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-15T01:31:52,661 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-default-config#-384780360], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present} 2025-10-15T01:31:52,661 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-default-config#-384780360], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} 2025-10-15T01:31:52,663 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config#-547291084], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-default-config#-384780360], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 808.2 μs 2025-10-15T01:31:52,906 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.126:2550 is unreachable 2025-10-15T01:31:52,912 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Candidate): Starting new election term 5 2025-10-15T01:31:52,912 | INFO | 
opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 5 2025-10-15T01:31:52,913 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@36c5bf7f 2025-10-15T01:31:52,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , received role change from Follower to Candidate 2025-10-15T01:31:52,914 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-toaster-operational from Follower to Candidate 2025-10-15T01:31:52,923 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 5 2025-10-15T01:31:52,924 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2055febb 2025-10-15T01:31:52,924 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-toaster-operational , received role change from Candidate to Leader 2025-10-15T01:31:52,924 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received role changed for member-2-shard-toaster-operational from Candidate to Leader 2025-10-15T01:31:52,925 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-15T01:31:55,363 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.170.126:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.126:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.170.142:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.170.193:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.142:2550 -> pekko://opendaylight-cluster-data@10.30.170.126:2550: Unreachable [Unreachable] (2), pekko://opendaylight-cluster-data@10.30.170.193:2550 -> pekko://opendaylight-cluster-data@10.30.170.126:2550: Unreachable [Unreachable] (1)] 2025-10-15T01:31:55,363 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 
11.0.2 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.126:2550,4378045957425099581)] 2025-10-15T01:31:55,365 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.170.126:2550] as [Down] 2025-10-15T01:31:55,366 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2025-10-15T01:32:02.365866986Z. 2025-10-15T01:31:55,385 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:31:56,448 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.170.126:2550] 2025-10-15T01:31:56,449 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:31:56,449 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:31:56,450 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Association to [pekko://opendaylight-cluster-data@10.30.170.126:2550] with UID [4378045957425099581] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. 
Reason: Cluster member removed, previous status [Down] 2025-10-15T01:31:58,113 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:31:58,913 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2 2025-10-15T01:31:59,067 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL3 10.30.170.126" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL3 10.30.170.126 2025-10-15T01:32:00,464 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:32:01,504 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:32:02,025 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:32:02,545 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:32:03,065 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2025-10-15T01:32:04,105 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | 
[outbound connection to [pekko://opendaylight-cluster-data@10.30.170.126:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.126/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2025-10-15T01:32:04,674 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1364721512]] to [pekko://opendaylight-cluster-data@10.30.170.142:2550]
2025-10-15T01:32:04,675 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.142:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1364721512]] (version [1.0.3])
2025-10-15T01:32:04,737 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.126:2550] is JOINING, roles [member-3, dc-default], version [0.0.0]
2025-10-15T01:32:05,625 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.126:2550] to [Up]
2025-10-15T01:32:05,627 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is not the leader any more and not responsible for taking SBR decisions.
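The cycle logged above, marking 10.30.170.126 UNREACHABLE, waiting out the stable-after period, taking the DownUnreachable decision, and later letting the restarted node rejoin and move to Up, is driven by the split-brain resolver. A hedged sketch of the kind of configuration involved, assuming the standard Pekko SBR keys; the strategy and timeout shown are placeholders rather than this cluster's actual settings:

```java
// Hedged sketch of split-brain-resolver settings behind the SBR decisions logged
// above. The keys follow standard Pekko SBR naming and the values are placeholders;
// the deployment's real settings live in its own pekko configuration.
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public final class SbrConfigSketch {
    public static void main(String[] args) {
        Config sbr = ConfigFactory.parseString(
            "pekko.cluster.downing-provider-class = "
                + "\"org.apache.pekko.cluster.sbr.SplitBrainResolverProvider\"\n"
            + "pekko.cluster.split-brain-resolver.active-strategy = keep-majority\n"
            + "pekko.cluster.split-brain-resolver.stable-after = 7s\n");
        // How long unreachability must remain stable before a downing decision,
        // mirroring the "stable-after period" mentioned in the log above.
        System.out.println(sbr.getDuration("pekko.cluster.split-brain-resolver.stable-after"));
    }
}
```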
2025-10-15T01:32:05,627 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:32:05,627 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:32:05,627 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T01:32:05,627 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T01:32:05,627 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T01:32:05,628 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T01:32:05,628 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T01:32:05,628 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T01:32:05,628 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T01:32:05,628 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T01:32:05,629 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | 
ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T01:32:05,628 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: All Shards are ready - data store config is ready 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T01:32:05,630 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-operational: All Shards are ready - data store operational is ready 2025-10-15T01:32:06,645 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node 
[pekko://opendaylight-cluster-data@10.30.170.142:2550] - is no longer leader 2025-10-15T01:32:08,995 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:32:08,995 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:32:09,133 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 26626, lastApplied : 19, commitIndex : 19 2025-10-15T01:32:09,134 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 18, snapshotTerm: 4, replicatedToAllIndex: 18 2025-10-15T01:32:09,134 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:32:09,135 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): Initiating install snapshot to follower member-3-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 18, leader lastIndex: 19, leader log size: 1 2025-10-15T01:32:09,135 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=19, lastAppliedTerm=4, lastIndex=19, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-topology-operational 2025-10-15T01:32:09,137 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Persising snapshot at EntryInfo[index=19, term=4]/EntryInfo[index=19, term=4] 2025-10-15T01:32:09,138 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Removed in-memory 
snapshotted entries, adjusted snapshotIndex: 18 and term: 4 2025-10-15T01:32:09,140 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 18, snapshotTerm: 4, replicatedToAllIndex: 18 2025-10-15T01:32:09,140 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): follower member-3-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:32:09,143 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: snapshot is durable as of 2025-10-15T01:32:09.138457829Z 2025-10-15T01:32:09,178 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=5, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 26522, lastApplied : 766, commitIndex : 766 2025-10-15T01:32:09,178 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=5, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 765, snapshotTerm: 5, replicatedToAllIndex: 765 2025-10-15T01:32:09,179 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:32:09,179 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): Initiating install snapshot to follower member-3-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 765, leader lastIndex: 766, leader log size: 1 2025-10-15T01:32:09,179 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=766, lastAppliedTerm=5, lastIndex=766, lastTerm=5, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on 
member-3-shard-inventory-operational 2025-10-15T01:32:09,183 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=5, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 765, snapshotTerm: 5, replicatedToAllIndex: 765 2025-10-15T01:32:09,184 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:32:09,189 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational (Leader): Snapshot successfully installed on follower member-3-shard-topology-operational (last chunk 1) - matchIndex set to 19, nextIndex set to 20 2025-10-15T01:32:09,230 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Persising snapshot at EntryInfo[index=766, term=5]/EntryInfo[index=766, term=5] 2025-10-15T01:32:09,230 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 765 and term: 5 2025-10-15T01:32:09,235 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: snapshot is durable as of 2025-10-15T01:32:09.230739945Z 2025-10-15T01:32:09,321 | WARN | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-topology-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 26524, lastApplied : -1, commitIndex : -1 2025-10-15T01:32:09,321 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-toaster-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 26524, lastApplied : -1, commitIndex : -1 2025-10-15T01:32:09,322 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, 
success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 26524, lastApplied : 59, commitIndex : 59 2025-10-15T01:32:09,322 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 58, snapshotTerm: 4, replicatedToAllIndex: 58 2025-10-15T01:32:09,323 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:32:09,323 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): Initiating install snapshot to follower member-3-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 58, leader lastIndex: 59, leader log size: 1 2025-10-15T01:32:09,323 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=59, lastAppliedTerm=4, lastIndex=59, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-default-operational 2025-10-15T01:32:09,326 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Persising snapshot at EntryInfo[index=59, term=4]/EntryInfo[index=59, term=4] 2025-10-15T01:32:09,327 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 58 and term: 4 2025-10-15T01:32:09,331 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: snapshot is durable as of 2025-10-15T01:32:09.327177301Z 2025-10-15T01:32:09,369 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=4, success=false, followerId=member-3-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 58, snapshotTerm: 4, replicatedToAllIndex: 58 2025-10-15T01:32:09,369 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | 
member-2-shard-default-operational (Leader): follower member-3-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2025-10-15T01:32:09,375 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational (Leader): Snapshot successfully installed on follower member-3-shard-default-operational (last chunk 1) - matchIndex set to 59, nextIndex set to 60 2025-10-15T01:32:09,502 | INFO | node-cleaner-0 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T01:32:09,703 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-3-shard-inventory-operational (last chunk 3) - matchIndex set to 766, nextIndex set to 767 2025-10-15T01:32:09,995 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-3-shard-inventory-config, logLastIndex=20034, logLastTerm=4, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27198, lastApplied : 20034, commitIndex : 20034 2025-10-15T01:32:10,087 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, nanosAgo=17428767031, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2} 2025-10-15T01:32:10,230 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, nanosAgo=132970729658, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=2} 2025-10-15T01:32:10,960 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, nanosAgo=28971578566, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=2} 2025-10-15T01:32:11,219 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-default-operational: Store Tx member-3-datastore-operational-fe-2-txn-3-0: Conflicting modification for path 
/(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)streams/stream/stream[{(urn:ietf:params:xml:ns:yang:ietf-subscribed-notifications?revision=2019-09-09)name=NETCONF}]. 2025-10-15T01:32:22,178 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart 2025-10-15T01:38:14,650 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2 2025-10-15T01:38:17,685 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:38:17,801 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart 2025-10-15T01:38:17,885 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2025-10-15T01:38:17,992 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=1}, nanosAgo=495822793117, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2} 2025-10-15T01:38:18,997 | INFO | opendaylight-cluster-data-notification-dispatcher-138 | ConnectionManagerImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Clearing the device connection timer for the device 1 2025-10-15T01:39:58,510 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 2025-10-15T01:40:00,174 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | DeviceOwnershipServiceImpl | 298 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.20.1 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER 
[wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:40:00,174 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2025-10-15T01:40:00,681 | INFO | node-cleaner-2 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Try to remove device openflow:1 from operational DS 2025-10-15T01:40:02,447 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2 2025-10-15T01:40:02,884 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart 2025-10-15T01:40:05,381 | INFO | sshd-SshServer[5cc2fe0e](port=8101)-nio2-thread-1 | ServerSessionImpl | 125 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.242:45800 authenticated 2025-10-15T01:40:06,071 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-titanium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot 2025-10-15T01:40:06,458 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory 2025-10-15T01:40:11,276 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification 2025-10-15T01:40:11,678 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower 2025-10-15T01:40:13,266 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster" | core | 112 - org.apache.karaf.log.core - 4.4.8 | 
ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster 2025-10-15T01:40:13,467 | INFO | qtp119977020-526 | StaticConfiguration | 204 - org.opendaylight.mdsal.binding-dom-adapter - 14.0.18 | Binding-over-DOM codec shortcuts are enabled 2025-10-15T01:40:13,497 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, nanosAgo=1107376276618, purgedHistories=MutableUnsignedLongSet{span=[5..5], size=1}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1} 2025-10-15T01:40:13,498 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-10-15T01:40:13,498 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:40:13,499 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 1.062 ms 2025-10-15T01:40:26,734 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 10000 2025-10-15T01:40:27,535 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow 2025-10-15T01:40:27,891 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating 
snapshot capture CaptureSnapshot [lastAppliedIndex=39323, lastAppliedTerm=4, lastIndex=39999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=720, mandatoryTrim=false] 2025-10-15T01:40:27,894 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=39323, term=4]/EntryInfo[index=39999, term=4] 2025-10-15T01:40:27,894 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 39322 and term: 4 2025-10-15T01:40:27,927 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:40:27.894674866Z 2025-10-15T01:40:28,024 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow 2025-10-15T01:40:28,473 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations 2025-10-15T01:40:41,748 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 9000 2025-10-15T01:40:43,140 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=59747, lastAppliedTerm=4, lastIndex=59999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=305, mandatoryTrim=false] 2025-10-15T01:40:43,141 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=59747, term=4]/EntryInfo[index=59999, term=4] 2025-10-15T01:40:43,143 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 59745 and term: 4 2025-10-15T01:40:43,173 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:40:43.142133214Z 2025-10-15T01:40:56,471 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 10000 2025-10-15T01:40:58,522 | INFO | 
opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=79998, lastAppliedTerm=4, lastIndex=79999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=614, mandatoryTrim=false] 2025-10-15T01:40:58,523 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=79998, term=4]/EntryInfo[index=79999, term=4] 2025-10-15T01:40:58,523 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 79411 and term: 4 2025-10-15T01:40:58,559 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:40:58.523731091Z 2025-10-15T01:41:12,275 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 9000 2025-10-15T01:41:13,258 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=99998, lastAppliedTerm=4, lastIndex=99999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=62, mandatoryTrim=false] 2025-10-15T01:41:13,259 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=99998, term=4]/EntryInfo[index=99999, term=4] 2025-10-15T01:41:13,260 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 99535 and term: 4 2025-10-15T01:41:13,286 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:41:13.259771872Z 2025-10-15T01:41:27,667 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 10000 2025-10-15T01:41:31,812 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=119998, lastAppliedTerm=4, lastIndex=119999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=462, mandatoryTrim=false] 2025-10-15T01:41:31,812 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=119998, term=4]/EntryInfo[index=119999, term=4] 2025-10-15T01:41:31,813 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | 
SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 119916 and term: 4 2025-10-15T01:41:31,845 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:41:31.812955172Z 2025-10-15T01:41:45,135 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 9000 2025-10-15T01:41:47,043 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=139998, lastAppliedTerm=4, lastIndex=139999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=74, mandatoryTrim=false] 2025-10-15T01:41:47,045 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=139998, term=4]/EntryInfo[index=139999, term=4] 2025-10-15T01:41:47,046 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 139747 and term: 4 2025-10-15T01:41:47,080 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:41:47.045832401Z 2025-10-15T01:42:02,518 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 10000 2025-10-15T01:42:02,557 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=159998, lastAppliedTerm=4, lastIndex=159999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=40, mandatoryTrim=false] 2025-10-15T01:42:02,558 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=159998, term=4]/EntryInfo[index=159999, term=4] 2025-10-15T01:42:02,558 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 159563 and term: 4 2025-10-15T01:42:02,613 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:42:02.558226152Z 2025-10-15T01:42:16,431 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 9000 2025-10-15T01:42:21,249 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=179998, lastAppliedTerm=4, lastIndex=179999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=804, mandatoryTrim=false] 2025-10-15T01:42:21,249 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=179998, term=4]/EntryInfo[index=179999, term=4] 2025-10-15T01:42:21,250 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 179900 and term: 4 2025-10-15T01:42:21,292 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:42:21.250130943Z 2025-10-15T01:42:33,871 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 10000 2025-10-15T01:42:36,182 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=199998, lastAppliedTerm=4, lastIndex=199999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=599, mandatoryTrim=false] 2025-10-15T01:42:36,186 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=199998, term=4]/EntryInfo[index=199999, term=4] 2025-10-15T01:42:36,187 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 199594 and term: 4 2025-10-15T01:42:36,219 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:42:36.186899167Z 2025-10-15T01:42:48,134 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 9000 2025-10-15T01:42:50,842 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=219934, lastAppliedTerm=4, lastIndex=219999, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=708, mandatoryTrim=false] 2025-10-15T01:42:50,844 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Persising snapshot at EntryInfo[index=219934, term=4]/EntryInfo[index=219999, term=4] 2025-10-15T01:42:50,844 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - 
org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 219623 and term: 4 2025-10-15T01:42:50,907 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: snapshot is durable as of 2025-10-15T01:42:50.844777411Z 2025-10-15T01:43:02,485 | INFO | ForkJoinPool-10-worker-1 | FlowReader | 297 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.20.1 | Total Flows read: 10000 2025-10-15T01:43:03,729 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations 2025-10-15T01:43:04,344 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations 2025-10-15T01:43:04,776 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node 2025-10-15T01:43:05,307 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] 
at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 2025-10-15T01:43:05,324 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | newPosition > limit: (33847044 > 3006066) 2025-10-15T01:43:05,330 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,331 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,332 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109925-1 sequence 0 (219914), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 219913 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:515) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:345) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:293) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[bundleFile:11.0.2] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
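The ShardManager warnings above all trace back to one root cause: while appending a log entry, the segmented journal's memory-mapped buffer was asked to move its position past its limit, which java.nio rejects with exactly the IllegalArgumentException seen here; every queued write that follows is then cancelled ("Previous action failed") and the frontend reconnects after the OutOfSequenceEnvelopeException. Below is a minimal, self-contained Java sketch of that java.nio contract only — it is not OpenDaylight code, and the buffer size and offset are the illustrative values copied from the log message, not the journal's actual segment configuration.

import java.nio.ByteBuffer;

// Minimal sketch (not OpenDaylight code): reproduces the java.nio check that
// fails inside MappedByteBuf.internalNioBuffer() in the traces above.
public class PositionLimitSketch {
    public static void main(String[] args) {
        // Heap buffer standing in for the mapped journal segment; its limit is
        // the 3_006_066 reported in the log, chosen here purely for illustration.
        ByteBuffer segment = ByteBuffer.allocate(3_006_066);
        try {
            // Requesting a position beyond the limit, as happens when a serialized
            // entry outgrows the mapped segment, throws IllegalArgumentException.
            segment.position(33_847_044);
        } catch (IllegalArgumentException e) {
            // On recent JDKs this prints the same form as the log:
            // newPosition > limit: (33847044 > 3006066)
            System.out.println(e.getMessage());
        }
    }
}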
2025-10-15T01:43:05,359 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:43:05,359 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1 2025-10-15T01:43:05,361 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-10-15T01:43:05,361 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:43:05,368 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,369 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,370 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
2025-10-15T01:43:05,386 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,425 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 63.67 ms
2025-10-15T01:43:05,425 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109220-1 sequence 0 (64), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 63
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:515) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:345) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:293) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[bundleFile:11.0.2]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
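The OutOfSequenceEnvelopeException above is the backend refusing a request envelope whose sequence number is not the next one it expects for that connection ("Expecting envelope 63"); the entries that follow show the client treating this as a signal to reconnect and resolve a fresh backend session. Below is a generic sketch of such a monotonic per-connection sequence check, written with hypothetical names purely to illustrate the failure mode; it is not the LeaderFrontendState logic.

// Hypothetical illustration of a per-connection sequence check of the kind that
// produces "Expecting envelope N"; not the actual OpenDaylight implementation.
final class EnvelopeSequenceChecker {

    /** Signals that an envelope arrived with an unexpected sequence number. */
    static final class OutOfSequenceException extends RuntimeException {
        OutOfSequenceException(final String message) {
            super(message);
        }
    }

    private long expectedSequence;

    /** Accepts an envelope only if it carries the next expected sequence number. */
    void checkSequence(final long receivedSequence) {
        if (receivedSequence != expectedSequence) {
            // Applying requests out of order could corrupt per-connection state, so the
            // envelope is rejected; the client side then reconnects and replays.
            throw new OutOfSequenceException("Expecting envelope " + expectedSequence);
        }
        expectedSequence++;
    }
}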
2025-10-15T01:43:05,426 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:43:05,427 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1 2025-10-15T01:43:05,429 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-10-15T01:43:05,429 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:43:05,430 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,431 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,431 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,432 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,432 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,433 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,432 | WARN | opendaylight-cluster-data-shard-dispatcher-41 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,433 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,433 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,434 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,434 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,435 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,435 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,435 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,436 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,437 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] 
at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,437 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,438 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,439 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,439 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] 
at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 
2 more 2025-10-15T01:43:05,440 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,440 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,440 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,440 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,441 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,441 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,449 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] 
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,449 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 20.53 ms 2025-10-15T01:43:05,450 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,454 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109252-1 sequence 0 (96), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 95 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:515) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:345) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:293) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[bundleFile:11.0.2] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
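The "Supervisor Strategy caught unexpected exception - resuming" records above all carry the same root cause: while the journal write task serializes an entry (JournalWriteTask.runBatch -> EntryJournalV1.appendEntry -> SegmentWriter.append -> BufThenFileOutputStream.switchToFile), java.nio.Buffer.position(int) is asked to move to position 33847044 in a mapped buffer whose limit is 3006066 and throws IllegalArgumentException; the batch is aborted, the queued writes are failed as java.util.concurrent.CancellationException: Previous action failed, and the ShardManager supervisor resumes the actor. The JDK-only sketch below reproduces the shape of that "Caused by:" line with arbitrary stand-in sizes; it illustrates the java.nio.Buffer precondition only, not the journal code.

    // Minimal, self-contained reproduction of the Buffer precondition behind the
    // repeated "newPosition > limit" traces. Buffer sizes here are arbitrary
    // stand-ins, unrelated to the journal's real segment sizes.
    import java.nio.ByteBuffer;

    public class NewPositionBeyondLimit {
        public static void main(String[] args) {
            ByteBuffer buf = ByteBuffer.allocate(16);
            buf.limit(8);
            try {
                buf.position(12); // 12 > limit 8 violates Buffer.position(int)'s precondition
            } catch (IllegalArgumentException e) {
                // On recent JDKs this prints a message of the form
                // "newPosition > limit: (12 > 8)", matching the log's
                // "newPosition > limit: (33847044 > 3006066)" with different numbers.
                System.out.println(e.getMessage());
            }
        }
    }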
2025-10-15T01:43:05,455 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,455 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1
2025-10-15T01:43:05,458 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-10-15T01:43:05,458 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,473 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-15T01:43:05,474 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,474 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,475 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,476 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,476 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,476 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,477 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,485 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 26.08 ms
2025-10-15T01:43:05,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109561-1 sequence 0 (405), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 404
2025-10-15T01:43:05,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1
2025-10-15T01:43:05,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-10-15T01:43:05,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,488 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,489 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,489 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,490 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,491 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 3.217 ms
2025-10-15T01:43:05,491 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109555-1 sequence 0 (13), reconnecting it
org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 12
2025-10-15T01:43:05,491 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,491 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1
2025-10-15T01:43:05,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-10-15T01:43:05,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,497 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,497 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,498 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,498 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,498 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,498 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,499 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2025-10-15T01:43:05,499 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,499 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
2 more 2025-10-15T01:43:05,499 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,510 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 13.71 ms 2025-10-15T01:43:05,510 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109566-1 sequence 0 (24), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 23 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:515) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:345) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:293) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[bundleFile:11.0.2] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-10-15T01:43:05,510 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:43:05,511 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1 2025-10-15T01:43:05,511 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present} 2025-10-15T01:43:05,512 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} 2025-10-15T01:43:05,512 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,513 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,513 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,513 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,514 | WARN | opendaylight-cluster-data-shard-dispatcher-44 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,516 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,516 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,516 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,516 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,516 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,517 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,517 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,517 | WARN | opendaylight-cluster-data-shard-dispatcher-40 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,517 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,518 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,518 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,518 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,519 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,519 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] 
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 2 more 2025-10-15T01:43:05,519 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming java.util.concurrent.CancellationException: Previous action failed at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2] at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?] Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066) at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?] at java.nio.Buffer.position(Buffer.java:327) ~[?:?] 
at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?] at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?] at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2] at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final] at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2] at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2] at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?] at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?] at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?] at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?] at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?] at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?] at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2] at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2] ... 
2 more 2025-10-15T01:43:05,520 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,521 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed 2025-10-15T01:43:05,518 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=8, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 5.749 ms 2025-10-15T01:43:05,523 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} indicated sequencing mismatch on member-2-datastore-config-fe-1-txn-109714-1 sequence 0 (172), reconnecting it org.opendaylight.controller.cluster.access.commands.OutOfSequenceEnvelopeException: Expecting envelope 171 at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.checkRequestSequence(LeaderFrontendState.java:225) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:112) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:515) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:345) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:293) ~[bundleFile:?] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:436) ~[bundleFile:11.0.2] at org.opendaylight.controller.cluster.raft.RaftActor.handleCommand(RaftActor.java:382) ~[bundleFile:11.0.2] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) [bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) [bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) [bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) [bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) [bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) [bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) [bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) [bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) [bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) [bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
2025-10-15T01:43:05,524 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,524 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1
2025-10-15T01:43:05,524 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}
2025-10-15T01:43:05,525 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:05,529 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=9, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} in 4.073 ms
2025-10-15T01:43:05,530 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-15T01:43:05,530 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,530 | WARN | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-15T01:43:05,531 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:05,733 | WARN | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Supervisor Strategy caught unexpected exception - resuming
java.util.concurrent.CancellationException: Previous action failed
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.newCancellationWithCause(JournalWriteTask.java:455) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.abortAndFailAction(JournalWriteTask.java:436) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:398) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.run(JournalWriteTask.java:305) ~[bundleFile:11.0.2]
    at java.lang.VirtualThread.run(VirtualThread.java:329) ~[?:?]
Caused by: java.lang.IllegalArgumentException: newPosition > limit: (33847044 > 3006066)
    at java.nio.Buffer.createPositionException(Buffer.java:352) ~[?:?]
    at java.nio.Buffer.position(Buffer.java:327) ~[?:?]
    at java.nio.ByteBuffer.position(ByteBuffer.java:1551) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:328) ~[?:?]
    at java.nio.MappedByteBuffer.position(MappedByteBuffer.java:73) ~[?:?]
    at org.opendaylight.raft.journal.MappedByteBuf.internalNioBuffer(MappedByteBuf.java:163) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.MappedByteBuf.getBytes(MappedByteBuf.java:366) ~[bundleFile:11.0.2]
    at io.netty.buffer.AbstractUnpooledSlicedByteBuf.getBytes(AbstractUnpooledSlicedByteBuf.java:397) ~[bundleFile:4.2.6.Final]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.switchToFile(BufThenFileOutputStream.java:144) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.spi.BufThenFileOutputStream.write(BufThenFileOutputStream.java:111) ~[bundleFile:11.0.2]
    at java.io.ObjectOutputStream$BlockDataOutputStream.write(ObjectOutputStream.java:1875) ~[?:?]
    at java.io.ObjectOutputStream.write(ObjectOutputStream.java:725) ~[?:?]
    at org.opendaylight.raft.spi.ChunkedByteArray.copyTo(ChunkedByteArray.java:62) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.datastore.persisted.CT.writeExternal(CT.java:40) ~[bundleFile:?]
    at java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1478) ~[?:?]
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1449) ~[?:?]
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1194) ~[?:?]
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:358) ~[?:?]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:245) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1$LogEntryWriter.objectToBytes(EntryJournalV1.java:209) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentWriter.append(SegmentWriter.java:90) ~[bundleFile:11.0.2]
    at org.opendaylight.raft.journal.SegmentedEntryWriter.append(SegmentedEntryWriter.java:52) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.EntryJournalV1.appendEntry(EntryJournalV1.java:480) ~[bundleFile:11.0.2]
    at org.opendaylight.controller.cluster.raft.spi.JournalWriteTask.runBatch(JournalWriteTask.java:382) ~[bundleFile:11.0.2]
    ... 2 more
2025-10-15T01:43:05,735 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | OneForOneStrategy | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Previous action failed
2025-10-15T01:43:20,664 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | Leader | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Leader): At least 1 followers need to be active, Switching member-2-shard-inventory-config from Leader to IsolatedLeader
2025-10-15T01:43:20,666 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config (Leader) :- Switching from behavior Leader to IsolatedLeader, election term: 4
2025-10-15T01:43:20,666 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 194 - org.opendaylight.controller.sal-clustering-commons - 11.0.2 | RoleChangeNotifier for member-2-shard-inventory-config , received role change from Leader to IsolatedLeader
2025-10-15T01:43:20,666 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | shard-manager-config: Received role changed for member-2-shard-inventory-config from Leader to IsolatedLeader
2025-10-15T01:43:24,135 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: Current transaction member-1-datastore-config-fe-2-chn-17-txn-0-1 has timed out after 19129 ms in state COMMIT_PENDING
2025-10-15T01:43:24,136 | WARN | opendaylight-cluster-data-shard-dispatcher-38 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: Transaction member-1-datastore-config-fe-2-chn-17-txn-0-1 is still committing, cannot abort
2025-10-15T01:43:35,545 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-2-shard-inventory-config#778398497], sessionId=10, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=present}}
2025-10-15T01:43:35,546 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 11.0.2 | member-2-frontend-datastore-config: refreshing backend for shard 1
2025-10-15T01:43:39,184 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: Current transaction member-1-datastore-config-fe-2-chn-17-txn-0-1 has timed out after 15050 ms in state COMMIT_PENDING
2025-10-15T01:43:39,185 | WARN | opendaylight-cluster-data-shard-dispatcher-43 | ShardDataTree | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: Transaction member-1-datastore-config-fe-2-chn-17-txn-0-1 is still committing, cannot abort
2025-10-15T01:43:48,458 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:49,487 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:50,507 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:51,527 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:52,546 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:53,566 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:54,586 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:55,567 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:43:55,606 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:56,626 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:57,647 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:58,666 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:43:59,687 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:00,707 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$t], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:01,726 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$u], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:02,746 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$v], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:03,766 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$w], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:04,786 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$x], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:05,806 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$y], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:06,825 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$z], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:07,847 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$A], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:08,866 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$B], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:09,887 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$C], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:10,906 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$D], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:11,927 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$E], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:12,947 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$F], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:13,966 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$G], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:14,986 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$H], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:16,007 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$I], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:16,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:44:17,026 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$J], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:18,046 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$K], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:19,066 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$L], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:20,086 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$M], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:21,106 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$N], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:22,126 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$O], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:23,147 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$P], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:24,166 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:25,187 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$R], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:26,207 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$S], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:27,227 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$T], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:28,246 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$U], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:29,266 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$V], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:30,286 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$W], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:31,306 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$X], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:32,326 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Y], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:33,346 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Z], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:34,366 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:35,386 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:36,406 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:37,427 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:37,645 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] 
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:44:38,445 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:39,466 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:40,487 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:41,506 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:44:42,526 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:44:43,547 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:44,567 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:45,587 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:46,606 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ab], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:47,626 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:48,647 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:49,667 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$db], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:50,687 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$eb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:51,706 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:52,726 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:53,746 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:54,766 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ib], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:55,786 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:56,807 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:57,827 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:58,685 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
... 5 more
2025-10-15T01:44:58,847 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:44:59,866 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:00,886 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ob], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:01,907 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:02,926 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:03,946 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:04,966 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:05,988 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:07,006 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ub], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:08,027 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:09,046 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:10,066 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:11,086 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:12,107 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:13,127 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ab], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:14,147 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:15,166 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:16,186 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Db], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:17,206 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Eb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:18,226 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:19,246 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:19,725 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
... 5 more
2025-10-15T01:45:20,266 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:21,286 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ib], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:22,307 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:23,326 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:24,347 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:25,366 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:26,386 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:27,406 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ob], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:28,427 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:29,446 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:30,466 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:31,486 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:32,506 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:33,526 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ub], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:34,546 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:35,566 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:36,586 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:37,606 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:38,626 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:39,647 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:40,666 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:40,766 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
... 5 more
2025-10-15T01:45:41,687 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:42,707 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:43,726 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:44,747 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:45,767 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:46,786 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:47,806 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:48,827 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:49,846 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:50,866 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:51,886 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ac], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:45:52,907 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:45:53,927 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:45:54,946 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:45:55,966 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ec], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:45:56,988 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:45:58,006 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:45:59,026 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:00,046 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ic], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:01,066 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:01,805 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] 
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:46:02,087 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:03,106 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:04,126 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:05,147 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
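Diagnostic note: the entries above show member-2-shard-inventory-config answering with isLeader: true but isLeaderActive: false, so every ConnectClientRequest from the member-1-frontend-datastore-config frontend is rejected at roughly one-second intervals until AbstractShardBackendResolver gives up with "Shard has no current leader" (the NoShardLeaderException raised from ShardManager.onShardNotInitializedTimeout). Below is a minimal sketch of polling the shard's raft state over Jolokia while this is happening; the controller address, port 8181, admin/admin credentials and the exact MBean name follow common OpenDaylight defaults and are assumptions, not values taken from this log.

import base64
import json
import time
import urllib.request

# Assumed values -- replace with the node that hosts member-2 and its real credentials.
HOST = "127.0.0.1"
MBEAN = ("org.opendaylight.controller:type=DistributedConfigDatastore,"
         "Category=Shards,name=member-2-shard-inventory-config")
URL = f"http://{HOST}:8181/jolokia/read/{MBEAN}"
AUTH = base64.b64encode(b"admin:admin").decode()

def read_shard_state():
    # Read the shard MBean via Jolokia and return its reported raft state and leader id.
    req = urllib.request.Request(URL, headers={"Authorization": "Basic " + AUTH})
    with urllib.request.urlopen(req, timeout=5) as resp:
        value = json.load(resp).get("value", {})
    return value.get("RaftState"), value.get("Leader")

if __name__ == "__main__":
    # Poll about once per second, mirroring the frontend retry cadence visible above.
    for _ in range(10):
        state, leader = read_shard_state()
        print(f"RaftState={state} Leader={leader}")
        if state == "Leader" and leader:
            break
        time.sleep(1)

A healthy shard would report itself as Leader (or as a follower pointing at a reachable leader); staying in the leader-but-not-active state for minutes, as this capture shows, is what drives the repeated rejections.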
2025-10-15T01:46:06,166 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:07,186 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:08,206 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:09,226 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:10,246 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:11,267 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:12,287 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:13,306 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:14,328 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:15,346 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:16,366 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:17,386 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:18,407 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ac], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:19,426 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:20,446 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:21,466 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:22,486 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ec], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:22,845 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:46:23,506 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. 
isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:24,527 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:25,546 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:26,567 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ic], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:27,586 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:28,606 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:29,626 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:30,646 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:31,666 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:32,686 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:33,706 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:34,726 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:35,746 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:36,766 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:37,786 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:38,807 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:39,826 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:40,846 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:41,866 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:42,885 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:43,885 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] 
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] 
at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:46:43,906 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:44,926 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:45,946 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:46,966 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
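Because the rejection entries differ only in timestamp, dispatcher thread and temp-actor suffix, a short script can summarize how long the shard stayed in this state and how many resolve timeouts it produced. This is a sketch only: it assumes the usual one-entry-per-line karaf.log layout and a hypothetical file name "karaf.log".

import re
from datetime import datetime

TS_FORMAT = "%Y-%m-%dT%H:%M:%S,%f"
REJECT = re.compile(
    r"^(\S+) \| INFO \| .*member-2-shard-inventory-config: "
    r"not currently leader, rejecting request")
TIMEOUT = re.compile(
    r"^(\S+) \| WARN \| .*AbstractShardBackendResolver.*Failed to resolve shard")

rejects, timeouts = [], []
with open("karaf.log", encoding="utf-8") as log:   # assumed file name
    for line in log:
        m = REJECT.match(line)
        if m:
            rejects.append(datetime.strptime(m.group(1), TS_FORMAT))
            continue
        m = TIMEOUT.match(line)
        if m:
            timeouts.append(datetime.strptime(m.group(1), TS_FORMAT))

if rejects:
    print(f"{len(rejects)} ConnectClientRequest rejections between "
          f"{rejects[0]} and {rejects[-1]} ({rejects[-1] - rejects[0]}), "
          f"{len(timeouts)} 'Failed to resolve shard' timeouts")

Against this capture it would report rejections spaced about one second apart and a resolve timeout roughly every 21 seconds.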
2025-10-15T01:46:47,986 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:49,006 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:50,026 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:51,047 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:52,066 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:53,086 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:54,106 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:55,126 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:56,146 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:57,167 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ad], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:46:58,187 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:46:59,207 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:00,227 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:01,246 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ed], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:02,266 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:03,286 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:04,306 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:04,925 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:47:05,326 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$id], minVersion=POTASSIUM, maxVersion=POTASSIUM}.
isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:06,347 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:07,367 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:08,386 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ld], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:09,406 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$md], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:10,426 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:11,446 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$od], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:12,466 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:13,485 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:14,507 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:15,549 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:16,576 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$td], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:17,597 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ud], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:18,616 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:19,636 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:20,656 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:21,678 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:22,696 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:23,716 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ad], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:24,737 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:25,756 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:25,964 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:47:26,776 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:27,796 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ed], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:28,816 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:29,836 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:30,857 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:31,876 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Id], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:32,897 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:33,916 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:34,937 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ld], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:35,955 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Md], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:36,977 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:37,997 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Od], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:39,016 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:40,036 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:41,056 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:42,075 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:43,097 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Td], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:44,116 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ud], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:45,136 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:46,156 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:47:47,004 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-15 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:47:47,176 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xd], minVersion=POTASSIUM, maxVersion=POTASSIUM}.
isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:48,196 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:49,216 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:50,236 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:51,255 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:52,276 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:53,297 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:54,316 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:55,336 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:56,357 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:57,377 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:47:58,396 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:47:59,417 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:00,436 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:01,456 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:02,476 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ae], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:03,496 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$be], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:04,515 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ce], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:05,536 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$de], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:06,557 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ee], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:07,576 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:08,035 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:48:08,596 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ge], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:09,617 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$he], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:10,636 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ie], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:48:11,656 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$je], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:12,676 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ke], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:13,696 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$le], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:14,716 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$me], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:15,737 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ne], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:16,756 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:17,776 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:18,799 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:19,816 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$re], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:20,836 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$se], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:21,856 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$te], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:22,876 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ue], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:23,897 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ve], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:24,920 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$we], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:25,940 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:26,955 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ye], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:27,976 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ze], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:28,996 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ae], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:29,075 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] 
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:48:30,016 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Be], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:31,036 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ce], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:32,056 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$De], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:33,076 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ee], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:34,097 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:35,116 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ge], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:36,137 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$He], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:37,156 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ie], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:38,177 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Je], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:39,197 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ke], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:40,216 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Le], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:41,236 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Me], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:42,256 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ne], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:43,276 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:44,296 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:45,316 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:46,336 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Re], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:47,356 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Se], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:48,376 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Te], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:49,396 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ue], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:50,115 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:48:50,417 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ve], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:51,436 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$We], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:52,456 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:53,477 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ye], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:54,496 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ze], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:55,516 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:56,536 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:57,557 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:48:58,576 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:48:59,596 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:00,616 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:01,636 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:02,656 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:03,676 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:04,696 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:05,716 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:06,735 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:07,756 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$af], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:08,776 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:09,796 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:10,816 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$df], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:11,154 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] 
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:49:11,836 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ef], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:12,856 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ff], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:13,876 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:14,896 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:15,916 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$if], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:16,936 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:17,956 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:18,976 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:19,996 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:21,016 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:22,037 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$of], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:23,057 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:24,076 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:25,096 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:26,116 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:27,136 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:28,155 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:29,176 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:30,198 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:31,217 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:32,184 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:49:32,236 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:33,257 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:34,277 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Af], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:35,296 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:36,316 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:37,337 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Df], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:38,356 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ef], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:39,376 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ff], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:40,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:41,416 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:42,437 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$If], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:43,456 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:44,477 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:45,497 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:45,785 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion 2025-10-15T01:49:46,516 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:47,536 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:48,556 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Of], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:49,576 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:50,597 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:51,617 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:52,636 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:53,225 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-47 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] 
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:49:53,656 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:54,676 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:55,696 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:56,717 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:57,737 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:49:58,757 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:49:59,780 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:00,796 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:01,816 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:02,836 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:03,856 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:04,876 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:05,896 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:06,916 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:07,936 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:08,956 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:09,976 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:10,996 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:12,017 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:13,037 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ag], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:14,056 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:14,265 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:50:15,077 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:16,097 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:17,116 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$eg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:18,136 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:19,155 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:20,176 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:21,197 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ig], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:22,216 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:23,236 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:24,257 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:33,941 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:34,966 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ng], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:35,305 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:50:35,986 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$og], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:37,008 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:38,082 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:39,202 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:40,216 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:41,253 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:42,276 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ug], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:43,298 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:44,316 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:45,337 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:50,621 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterHeartbeat | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Scheduled sending of heartbeat was delayed. Previous heartbeat was sent [3589] ms ago, expected interval is [1000] ms. This may cause failure detection to mark members as unreachable. The reason can be thread starvation, CPU overload, or GC. 2025-10-15T01:50:50,626 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | ClusterHeartbeat | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Scheduled sending of heartbeat was delayed. Previous heartbeat was sent [2001] ms ago, expected interval is [1000] ms. This may cause failure detection to mark members as unreachable. The reason can be thread starvation, CPU overload, or GC. 
2025-10-15T01:50:50,627 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.126:2550, Up)]. 2025-10-15T01:50:50,627 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.193:2550, Up)]. 2025-10-15T01:50:50,628 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist). 2025-10-15T01:50:50,630 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:50:50,630 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:50:50,630 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:50:50,630 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received UnreachableMember: memberName MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:50:50,631 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.193:2550,8731972305940241042)] 2025-10-15T01:50:50,631 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Ignoring received gossip status from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.193:2550,8731972305940241042)] 2025-10-15T01:50:50,632 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Ignoring received gossip from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.193:2550,8731972305940241042)] 2025-10-15T01:50:50,725 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, 
hasOwner=true] 2025-10-15T01:50:50,805 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 309 - org.opendaylight.openflowplugin.impl - 0.20.1 | Entity ownership change received for node : LOCAL_OWNERSHIP_RETAINED_WITH_NO_CHANGE [wasOwner=true, isOwner=true, hasOwner=true] 2025-10-15T01:50:50,976 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | ClusterGossip | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Ignoring received gossip from unreachable [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.126:2550,606580191913167311)] 2025-10-15T01:50:50,978 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$f], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:51,494 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:51,634 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - is the new leader among reachable nodes (more leaders may exist) 2025-10-15T01:50:51,636 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking node as REACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.126:2550, Up)]. 2025-10-15T01:50:51,637 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:50:51,637 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.170.126:2550 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - Marking node as REACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.193:2550, Up)]. 
2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2025-10-15T01:50:58.638356465Z. 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | SBR found all unreachable members healed during stable-after period, no downing decision necessary for now. 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-32 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | This node is not the leader any more and not responsible for taking SBR decisions. 
2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-default-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-topology-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-toaster-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-default-operational 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-topology-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to 
pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-config/member-3-shard-inventory-config 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T01:50:51,646 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.126:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardManager | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Received ReachableMember: memberName MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.193:2550 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-config: Peer address for peer member-1-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-default-config 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-config: Peer address for peer member-1-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-topology-config 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 
2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-default-operational: Peer address for peer member-1-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-default-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-topology-operational: Peer address for peer member-1-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-topology-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | ShardInformation | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-operational: Peer address for peer member-1-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-inventory-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-config: Peer address for peer member-1-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-toaster-config 2025-10-15T01:50:51,648 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-toaster-operational: Peer address for peer member-1-shard-toaster-operational set to 
pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-operational/member-1-shard-toaster-operational 2025-10-15T01:50:51,647 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 11.0.2 | member-2-shard-inventory-config: Peer address for peer member-1-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.193:2550/user/shardmanager-config/member-1-shard-inventory-config 2025-10-15T01:50:52,004 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$m], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:52,515 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:52,655 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 11.0.2 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.142:2550] - is no longer leader 2025-10-15T01:50:53,020 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$n], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:53,536 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Eg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:54,040 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$o], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:54,556 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:55,060 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$p], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:55,576 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:56,080 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:56,344 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?] at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?] at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?] ... 5 more 2025-10-15T01:50:56,596 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:57,100 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$r], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:57,616 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ig], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:50:58,120 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$s], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:58,636 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:59,140 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$t], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:50:59,656 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:00,160 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$u], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:00,676 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:01,179 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$v], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:01,696 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:02,200 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$w], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:02,716 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ng], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:03,219 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$x], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:03,736 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Og], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:04,242 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$y], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:04,756 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:05,262 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$z], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:05,776 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:06,280 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$A], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:06,796 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:07,310 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$B], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:07,815 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:08,330 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$C], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:08,856 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:09,349 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$D], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:09,876 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ug], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:10,388 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$E], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:10,896 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:11,409 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$F], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:11,916 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:12,429 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$G], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:12,936 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:13,449 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$H], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:13,956 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:14,470 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$I], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:14,976 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zg], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:15,492 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$J], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:15,996 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:16,510 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$K], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:17,016 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:17,384 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:51:17,530 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$L], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:18,036 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:18,550 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$M], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:19,057 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:19,570 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$N], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
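The repeated INFO lines in this stretch all report the same state on member-2: the shard believes it has won the election (isLeader: true) but its leadership is not yet active (isLeaderActive: false), so every incoming ConnectClientRequest from the member-1 and member-3 frontends is turned away. A minimal Java sketch of that gating condition, using hypothetical names (this is not the OpenDaylight Shard implementation), is:

// Illustrative sketch only: a node that has won the election but whose
// leadership is not yet active rejects client connect attempts.
final class ShardConnectGate {
    private volatile boolean leader;        // election won ("isLeader" in the log)
    private volatile boolean leaderActive;  // leadership fully established ("isLeaderActive")

    boolean canAcceptConnect() {
        // Mirrors the logged condition: accept only when the leader is active.
        return leader && leaderActive;
    }

    String state() {
        return "isLeader: " + leader + ", isLeaderActive: " + leaderActive;
    }
}

The timestamps above show each frontend reconnecting roughly every half second, which is why the same rejection line repeats for member-1 and member-3 in alternation.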
2025-10-15T01:51:20,077 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:20,591 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$O], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:21,096 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:21,610 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$P], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:22,116 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:22,630 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Q], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:23,136 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:23,651 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$R], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:24,157 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:24,671 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$S], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:25,176 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:25,700 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$T], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:26,196 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:26,721 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$U], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:27,217 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~g], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:27,741 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$V], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:28,237 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ah], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:28,761 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$W], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:29,256 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:29,780 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$X], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:30,276 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ch], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:30,801 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Y], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:31,296 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:31,820 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Z], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:32,317 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$eh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:32,842 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:33,336 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:33,861 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:34,356 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:34,881 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:35,376 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:35,902 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:36,396 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ih], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:36,921 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:37,416 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:37,940 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:38,424 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-35 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:51:38,436 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
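The WARN above is the client-side view of the same stall: AbstractShardBackendResolver surfaces the connect failure as a TimeoutException whose cause is NoShardLeaderException, with the hint "Try again later." A caller that sees this is expected to back off and retry once a shard leader is elected. A minimal retry sketch under that assumption (plain Java, hypothetical names, not an OpenDaylight API) could look like:

// Illustrative retry sketch: back off and retry an operation that fails
// while the shard has no leader ("Try again later").
import java.util.concurrent.Callable;
import java.util.concurrent.TimeoutException;

final class RetryUntilLeader {
    static <T> T call(Callable<T> op, int maxAttempts, long backoffMillis) throws Exception {
        TimeoutException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return op.call();                      // e.g. a datastore read or write
            } catch (TimeoutException e) {             // "Shard has no current leader"
                last = e;
                Thread.sleep(backoffMillis * attempt); // simple linear backoff between attempts
            }
        }
        if (last == null) {
            throw new IllegalArgumentException("maxAttempts must be >= 1");
        }
        throw last;                                    // still no leader after maxAttempts
    }
}

In this log the two "Failed to resolve shard" warnings are roughly 21 seconds apart (01:51:17,384 and 01:51:38,424), so any realistic retry interval for this situation would be on that order rather than milliseconds.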
2025-10-15T01:51:38,961 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:39,455 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:39,981 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:40,476 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:41,194 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:42,211 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:42,215 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:43,232 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:43,236 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:44,252 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:44,256 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ph], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:45,273 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ab], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:45,277 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:46,292 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:46,296 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:47,315 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:47,316 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:48,333 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$db], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:48,335 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$th], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:49,353 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$eb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:49,356 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:50,372 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:50,375 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:51,392 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:51,396 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:52,412 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:52,416 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:53,433 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ib], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:53,436 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:54,453 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:54,455 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:55,473 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:55,476 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ah], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:56,493 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:51:56,495 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:51:57,516 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ch], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:57,517 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:58,533 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:58,535 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:59,465 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-46 | AbstractShardBackendResolver | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:309) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:307) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:231) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:73) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:110) ~[bundleFile:?]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:110) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:57) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-2-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:630) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleCommand(ShardManager.java:244) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:33) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:29) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:?]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:545) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.org$apache$pekko$persistence$Eventsourced$$super$aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced$$anon$4.stateReceive(Eventsourced.scala:931) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive(Eventsourced.scala:256) ~[bundleFile:?]
    at org.apache.pekko.persistence.Eventsourced.aroundReceive$(Eventsourced.scala:255) ~[bundleFile:?]
    at org.apache.pekko.persistence.AbstractPersistentActor.aroundReceive(PersistentActor.scala:305) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:280) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:241) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:253) ~[bundleFile:?]
    ... 5 more
2025-10-15T01:51:59,553 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ob], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:51:59,556 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Eh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:52:00,573 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:52:00,576 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:52:01,593 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:01,596 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:02,614 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:02,616 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:03,634 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:03,636 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ih], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:04,654 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:04,656 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:05,674 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ub], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:05,676 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:06,694 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:06,696 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:07,714 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:07,715 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:08,734 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:08,736 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:09,754 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:09,756 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:10,775 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:10,777 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ph], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:11,795 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ab], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:11,796 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:12,815 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:12,816 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:13,835 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:13,836 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:14,861 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Db], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:14,867 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Th], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:15,885 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Eb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:15,886 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:16,905 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:16,906 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:17,926 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:17,927 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:18,945 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:18,946 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:19,965 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ib], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:19,966 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:20,986 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zh], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:20,987 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:22,006 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:22,007 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:23,026 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:23,026 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:24,046 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:24,046 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:25,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:25,066 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:26,086 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ob], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:26,087 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:27,106 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:27,106 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:28,126 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:28,127 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:29,146 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:29,147 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:30,166 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:30,167 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:31,186 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:31,186 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:32,206 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:32,207 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ub], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:33,227 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~h], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:33,227 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:34,246 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ai], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:34,246 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:35,266 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:35,267 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:36,285 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ci], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:36,286 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:37,306 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$di], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:37,308 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zb], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:38,325 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ei], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:38,327 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:39,347 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:39,348 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:40,366 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:40,367 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:41,386 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:41,387 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:42,406 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ii], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:42,408 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:43,426 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ji], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:43,428 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:44,445 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ki], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:44,447 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:45,465 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$li], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:45,467 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:46,486 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:46,487 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:47,507 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ni], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:47,508 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:48,526 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:48,527 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:49,546 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:49,547 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~b], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:50,565 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:50,568 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ac], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:51,586 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ri], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:51,587 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:52,606 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$si], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:52,608 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:53,626 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ti], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:53,628 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:54,646 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ui], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:54,648 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ec], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:55,666 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:55,668 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:56,686 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:56,688 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:57,707 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:57,711 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:58,726 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:58,727 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ic], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:52:59,746 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:52:59,748 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:00,766 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ai], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:00,768 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:01,786 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:01,790 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:02,806 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ci], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:02,808 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:03,826 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Di], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:03,828 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:04,846 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ei], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:04,850 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:05,866 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:05,869 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:06,886 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:06,889 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:07,906 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:07,909 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:08,925 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ii], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:08,929 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:09,946 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ji], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:09,949 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:10,966 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ki], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:10,969 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:11,986 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Li], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:11,989 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:13,006 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:13,010 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:14,027 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ni], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:14,049 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:15,045 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:15,069 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:16,066 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:16,090 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:17,086 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:17,110 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ac], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:18,106 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ri], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:18,129 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:19,126 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Si], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:19,152 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:20,146 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ti], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:20,170 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:21,167 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ui], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:21,211 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ec], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:22,186 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:53:22,231 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:53:23,206 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:23,251 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:24,226 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:24,270 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:25,246 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:25,291 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ic], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:26,266 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zi], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:26,329 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:27,286 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:27,351 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:28,306 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:28,371 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:29,326 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:29,390 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:30,346 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:30,411 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:31,366 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:31,431 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:32,386 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:32,451 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:33,406 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:33,471 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:34,427 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:34,492 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:35,445 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:35,512 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:36,466 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:36,531 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:37,485 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:37,552 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:38,506 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~i], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:38,573 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:39,526 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$aj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:39,591 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:40,545 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:40,612 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:41,566 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:41,631 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:42,586 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:42,653 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zc], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:43,606 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ej], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:43,671 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:44,626 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:44,692 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:45,647 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:45,712 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:46,666 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:46,732 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:47,686 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ij], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:47,752 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:48,706 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:48,773 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:49,726 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:49,793 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:50,746 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:50,812 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:51,766 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:51,852 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:52,786 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:52,873 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:53,805 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:53,893 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:54,826 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:54,913 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~c], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:55,846 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:55,934 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ad], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:56,866 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:56,953 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:57,886 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:57,973 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:58,906 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:58,993 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:53:59,925 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:00,013 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ed], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:00,945 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:01,033 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:01,965 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:02,053 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:02,985 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:03,073 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:04,006 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:04,093 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$id], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:05,026 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:05,114 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:06,168 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Aj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:07,032 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:07,744 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:08,522 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ld], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:08,895 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:09,544 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$md], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:09,916 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:10,564 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:10,937 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ej], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:11,677 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$od], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:11,955 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:54:12,694 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:12,976 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:13,715 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:13,996 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:14,734 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:15,016 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ij], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:15,755 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:16,036 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:16,774 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$td], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:17,055 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:17,794 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ud], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:18,077 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:18,816 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:19,096 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:19,834 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:20,116 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:20,854 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:21,136 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:21,875 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:22,157 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:22,896 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:23,176 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:23,918 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ad], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:24,196 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:24,934 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:25,216 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:25,954 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:26,236 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:26,975 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:27,256 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:27,994 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ed], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:28,276 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:29,015 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:29,296 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:30,036 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:30,316 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:31,055 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:31,335 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:32,076 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Id], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:32,356 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zj], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:33,096 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:33,376 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:34,116 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:34,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:35,136 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ld], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:35,416 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:36,156 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Md], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:36,436 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:37,176 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:37,456 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:38,196 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Od], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:38,476 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:39,216 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:39,495 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:40,235 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:40,515 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:41,256 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:41,536 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:42,276 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:42,556 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:43,296 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Td], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:43,576 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:44,316 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ud], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:44,596 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~j], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:45,336 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:45,616 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ak], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:46,636 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:47,396 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:47,656 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ck], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:48,416 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:48,676 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:49,436 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zd], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:49,696 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ek], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:50,456 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:50,716 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:51,476 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:51,736 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:52,496 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:52,756 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:53,517 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:53,775 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ik], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:54,536 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:54,796 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:55,557 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:55,816 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:56,577 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:56,835 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:57,601 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:57,856 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:54:58,627 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:58,876 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:59,647 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:54:59,896 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ok], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:00,667 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:00,916 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:01,687 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~d], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:01,936 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:02,707 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ae], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:02,955 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:03,728 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$be], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:03,976 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:04,748 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ce], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:04,996 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:05,767 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$de], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:06,016 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:06,789 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ee], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:07,035 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:07,808 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:08,056 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:08,828 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ge], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:09,076 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:09,848 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$he], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:10,096 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:10,868 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ie], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:11,116 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:11,888 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$je], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:12,136 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ak], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:12,908 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ke], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:13,155 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:13,927 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$le], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:14,175 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ck], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:14,948 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$me], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:15,196 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:15,968 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ne], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:16,216 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ek], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:16,987 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$oe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:17,236 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:18,008 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:18,255 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:19,028 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:19,275 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:20,048 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$re], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:20,296 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ik], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:21,068 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$se], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:21,317 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:22,088 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$te], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:22,336 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:23,108 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ue], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:23,356 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:24,129 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ve], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:24,376 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:25,148 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$we], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:25,396 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:26,169 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:26,416 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ok], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:27,188 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ye], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:27,436 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:28,209 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ze], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:28,456 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:29,229 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ae], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:29,476 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:30,249 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Be], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:30,497 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:31,269 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ce], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:31,515 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:32,290 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$De], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:32,535 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:33,309 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ee], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:33,556 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:34,328 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:34,575 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:35,349 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ge], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:35,597 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:36,370 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$He], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:36,616 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:37,389 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ie], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:37,636 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zk], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:38,409 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Je], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:38,656 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:39,429 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ke], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:39,676 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:40,449 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Le], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:40,696 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:41,469 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Me], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:41,716 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:42,489 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ne], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:42,735 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:43,509 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Oe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:43,756 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:44,529 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:44,775 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:45,550 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:45,796 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:46,570 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Re], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:46,816 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:47,590 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Se], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:47,836 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:48,610 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Te], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:48,856 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:49,630 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ue], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:49,876 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~k], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:50,650 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ve], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:50,896 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$al], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:51,670 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$We], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:51,915 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:52,690 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xe], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:52,936 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:53,710 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ye], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:53,955 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$dl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:54,730 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ze], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:54,975 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$el], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:55,750 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:55,996 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$fl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:56,770 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:57,015 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:57,790 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:58,036 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:58,811 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:55:59,055 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$il], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:55:59,830 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:00,076 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:00,850 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:01,096 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:01,870 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:02,117 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ll], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:02,891 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:03,136 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ml], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:03,911 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:04,156 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:04,931 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:05,176 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ol], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:05,950 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:06,196 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:06,970 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~e], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:07,216 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ql], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:07,991 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$af], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:08,236 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:09,011 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:09,255 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:10,031 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:10,275 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:11,051 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$df], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:11,296 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ul], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:12,071 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ef], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:12,316 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:13,091 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$ff], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:13,335 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:14,111 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$gf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:14,355 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:15,132 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$hf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:15,376 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:16,151 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$if], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:16,396 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:17,171 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$jf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:17,415 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Al], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:18,192 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$kf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:18,436 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:19,212 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$lf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:19,455 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:20,231 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$mf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:20,476 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Dl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:21,252 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$nf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:21,496 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$El], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:22,271 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$of], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:22,515 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Fl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:23,291 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$pf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:23,536 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:24,312 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$qf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:24,556 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:25,332 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$rf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:25,576 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Il], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:26,352 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$sf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:26,560 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification" | core | 112 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification 
2025-10-15T01:56:26,596 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:27,373 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$tf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:27,616 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:28,392 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$uf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:28,636 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ll], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:29,412 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$vf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:29,656 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ml], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:30,433 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$wf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:30,677 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:31,451 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$xf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:31,696 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ol], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:32,472 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$yf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:32,716 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:33,492 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$zf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:33,736 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ql], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:34,512 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Af], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:34,756 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:35,532 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Bf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:35,775 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:36,552 | INFO | opendaylight-cluster-data-shard-dispatcher-42 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Cf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:36,797 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 2025-10-15T01:56:37,572 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Df], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false. 
2025-10-15T01:56:37,816 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ul], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:38,592 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ef], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:38,835 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:39,613 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Ff], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:39,855 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:40,632 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Gf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:40,876 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:41,653 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Hf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:41,896 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Yl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:42,673 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$If], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:42,916 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Zl], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:43,693 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Jf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:43,936 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$0l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:44,713 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Kf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:44,956 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$1l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:45,732 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Lf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:45,976 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$2l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:46,753 | INFO | opendaylight-cluster-data-shard-dispatcher-45 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Mf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:46,996 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$3l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:47,773 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Nf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:48,016 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$4l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:48,794 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Of], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:49,036 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$5l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:49,814 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Pf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:50,056 | INFO | opendaylight-cluster-data-shard-dispatcher-44 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$6l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:50,832 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Qf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:51,076 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$7l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:51,853 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Rf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:52,096 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$8l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:52,872 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Sf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:53,116 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$9l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:53,893 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Tf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:54,136 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$+l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:54,913 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Uf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:55,155 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$~l], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:55,933 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Vf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:56,179 | INFO | opendaylight-cluster-data-shard-dispatcher-43 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$am], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:56,953 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Wf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:57,196 | INFO | opendaylight-cluster-data-shard-dispatcher-41 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$bm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:57,973 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-3-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.126:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$Xf], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.
2025-10-15T01:56:58,215 | INFO | opendaylight-cluster-data-shard-dispatcher-40 | Shard | 196 - org.opendaylight.controller.sal-distributed-datastore - 11.0.2 | member-2-shard-inventory-config: not currently leader, rejecting request ConnectClientRequest{target=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=2}, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data@10.30.170.193:2550/temp/_user_shardmanager-config_member-2-shard-inventory-config$cm], minVersion=POTASSIUM, maxVersion=POTASSIUM}. isLeader: true, isLeaderActive: false,isLeadershipTransferInProgress: false.